
Comment Re:Research specs carefully first. (Score 1) 147

ONVIF support is very helpful for obtaining the URLs of the RTSP streams and JPEG snapshots, if available. But even better would be just replacing the original firmware with a cleaner design.
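For those unfamiliar with it, RTSP itself is a simple text-based protocol (RFC 2326), so probing a camera's stream URL by hand is easy. A minimal sketch below — the host address and the /stream1 path are hypothetical examples; in practice the paths vary wildly per vendor, which is exactly the guessing game that ONVIF's stream-URI discovery saves you from:

```python
# Sketch: compose an RTSP DESCRIBE request (RFC 2326) and parse the status
# line of a reply. The address 192.0.2.10 and the /stream1 path are made-up
# examples; real cameras use vendor-specific paths.

def build_describe(url: str, cseq: int = 1) -> bytes:
    """Build a DESCRIBE request for the given rtsp:// URL."""
    return (f"DESCRIBE {url} RTSP/1.0\r\n"
            f"CSeq: {cseq}\r\n"
            f"Accept: application/sdp\r\n\r\n").encode()

def parse_status(reply: bytes) -> int:
    """Extract the numeric status code from an RTSP status line."""
    status_line = reply.split(b"\r\n", 1)[0]   # e.g. b"RTSP/1.0 200 OK"
    return int(status_line.split(b" ")[1].decode())

req = build_describe("rtsp://192.0.2.10:554/stream1")
print(parse_status(b"RTSP/1.0 200 OK\r\nCSeq: 1\r\n\r\n"))  # → 200
```

A 200 reply carries an SDP description of the streams; a 404 on one path just means you guessed wrong and should try the next common path (or do it properly via ONVIF).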

The starting point for this would be identifying the hardware platform of the camera. While there's lots of variance, many different cameras are based on the same generic SoCs and sensors and the same reference PCB designs. (Also, those Chinese camera modules are typically generic enough to feature unpopulated connectors for a microphone or PTZ motors — the capability to connect those is there, and might even be fully active in the original firmware, regardless of whether the camera was sold with those features. They also tend to use lots of interchangeable components, such as lenses or mechanically-compatible IR LED boards in a handful of different designs, which means you can replace those LED boards with generic spares when their IR LEDs eventually go out.) Sadly, manufacturers and sellers often don't say in the listing which SoC or sensor they're using in their camera. Some rare AliExpress vendors might, but most don't. You could always try asking the seller, though.

But if you are able to obtain this information before buying (unlikely), or just buy something and open the case and happen to get lucky (your SoC and sensor are supported), you might be able to replace the Chinese crap^H^H^H^Hfirmware with an OpenIPC-based firmware. See here for some screenshots of the web-based management interface:

https://openipc.org/web-interf...

WiFi cameras are hopeless, though — especially if you have more than just one or two. They'll just be congesting whichever WiFi channel they're on and shouting over each other, causing collisions and making for very lousy network performance. Wired Ethernet + PoE (power over the Ethernet cable) is the way to go.

If one must use a wireless link to cross a distance (say, over a large yard — maybe from an outbuilding to the main building) and needs to install more than just one camera in the remote location, buy a separate, dedicated "WiFi bridge" device pair. Then wire the cameras in the remote location with Ethernet/PoE, and pass all their traffic over this single WiFi link (where their streams will be orderly interleaved), instead of using the WiFi radios of the cameras themselves (those are better turned off). Performance will be much better this way.

Comment Re:I get it. (Score 1) 70

I know exactly what he’s talking about. That’s why I got an old-school Trinitron (which someone was giving away for free) and now treat it with the greatest care.

Here’s a great example: https://www.reddit.com/r/inter...

That is indeed an excellent illustration of why ”perfect rectangles” is not a desirable representation of pixels when simulating computer graphics originally intended for display on a 15 kHz CRT TV or video monitor.

In some comments to this article, though, there seems to be some confusion over what you'd actually want to see in a CRT simulation. Granted, there were many different types and generations of CRT displays. They varied in terms of their accepted horizontal and vertical timings, the dot pitch of their shadow mask or the density of their aperture grille, their dynamic range, their effective resolution, the persistence of their phosphors, and the details of their electron gun/beam control. There were also many different signal types and color encoding standards implemented for driving the display — some of which generated their own unique artifacts.

But if you do not fetishize the artifacts as such (such as RF noise, or hum bars slowly traveling across the screen, or adjacent colors affecting each other in unintended ways) but just want the nicest-looking, most authentic and crisp picture you could have hoped to achieve back in the day, it is very easy to specify the gold standard that the emulation/simulation should strive for:

It is something akin to a Commodore Amiga (any model) driving a TV-compatible 15kHz computer/video monitor — such as the Philips CM8833 or the Commodore 1084S — using RGB signaling. No more, no less. Make it look the same and you’re golden.

What this means in practice is:

1. Forget about EGA, VGA, and SVGA. They’re not relevant to the types of signal sources we’re talking about (8-bit and 16/32-bit home computers and game consoles of the 1980s and early 1990s). The monitor whose display characteristics you want to simulate should be a CGA computer/video monitor, suitable for both computer and video work (that is, also capable of being used as a CCTV monitor, or for video editing using VCRs).

The monitor needs to be compatible with the standard-definition 625/50 and 525/59.94 interlaced video rasters, and their half-the-vertical-resolution, non-interlaced (progressive) variants, with a 15kHz horizontal scan rate. These are the signal types that were conventionally produced by the hobbyist/home computer systems and video game consoles of the era.
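That 15 kHz figure is not an arbitrary constant — it falls straight out of the raster timings: total lines per frame times frames per second. A quick check:

```python
# The ~15 kHz horizontal scan rate derived from the standard raster timings:
# total lines per frame multiplied by the frame rate.

def line_rate_hz(total_lines: int, frame_rate_hz: float) -> float:
    return total_lines * frame_rate_hz

print(line_rate_hz(625, 25))            # 625/50 systems → 15625.0 Hz
print(line_rate_hz(525, 30000 / 1001))  # 525/59.94 systems → ~15734.27 Hz
```

Both standards land within a fraction of a percent of each other, which is why a single "15 kHz" monitor can typically sync to either raster.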

Such monitors are generally close to TVs in their design (e.g. they feature a PAL or NTSC color decoder for decoding CVBS and Y/C signal types) but lack an RF tuner. They have more user-accessible picture controls and support more (professional) input signal types (RGB or Y'PbPr), along with consumer video signals, and are capable of displaying higher effective resolution (discernible ”TV lines” — that is, horizontal resolution on a scanline) than your typical (portable) CRT TV of similar size.

2. Make sure that you attempt to simulate scanlines, and align the input video raster (scanline raster) to the pixel matrix of the target display. This immediately leads to a specific technical restriction on the output side: the vertical resolution of the simulated CRT image must not be allowed to scale freely on an LCD or OLED display, but only in discrete steps that are multiples of the number of active scanlines in the original input signal. (e.g. 288, 576, 1152, or 1728 for a 625-line signal and 240, 480, 960, and 1440 for a 525-line signal.) Otherwise you will run into aliasing artifacts.

For instance, your starting point would be the original vertical (active) resolution as is. Or, more accurately, half the nominal vertical resolution of a standard, interlaced video signal, since the home computers and game consoles of the 1980s commonly output a slightly non-standard, non-interlaced signal, which the TV draws as consecutive fields forming effectively progressive frames, with no half-line displacement between the ”even” and ”odd” fields. At this stage, there can be no scanline emulation, as there is no room for it: each scanline will occupy a single pixel row on the target display.

A step up from this is doubling the original vertical resolution and adding a dimmer, interpolated line in-between any original scanlines. (Just leaving every other line blank is too drastic, and not really how the image looked on a CRT. Typically, the scanlines are thicker than the inter-scanline gap.)

Another step up would be reserving two horizontal pixel rows per original scanline, plus a dimmer line (row) in-between them to mark the inter-scanline gap. Now we’re at three times the original vertical resolution.

Thereafter — as the resolution allotted for each scanline will increase in discrete steps — you may start simulating the actual beam shape with increasing accuracy. This would involve simulating the rounded/dimmed/tapering edges and blooming at points where the beam switches on or off or increases or decreases in brightness. Such things help recreate the smooth edges of text and pixel art and the overall, scanline-based ”video raster look” of CRT graphics.
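The stepping scheme above can be sketched in a few lines of code. The brightness weights here are illustrative guesses, not measured values — the point is the structure: output heights come only in integer multiples of the active line count, and each multiple implies a fixed per-scanline pattern of sub-rows:

```python
# Sketch of the discrete vertical scaling described above. Output heights are
# restricted to integer multiples of the active scanline count, and each
# scaling factor maps a scanline to a fixed pattern of sub-row weights.
# The weight values are illustrative, not profiled from a real CRT.

def allowed_heights(active_lines: int, max_height: int) -> list[int]:
    return [active_lines * k for k in range(1, max_height // active_lines + 1)]

SCANLINE_WEIGHTS = {
    1: [1.0],             # no room for scanline emulation
    2: [1.0, 0.5],        # scanline + dimmer interpolated line
    3: [1.0, 1.0, 0.4],   # two full rows + dim inter-scanline gap
}

def expand(scanlines: list[float], factor: int) -> list[float]:
    """Map each scanline intensity to `factor` output rows."""
    w = SCANLINE_WEIGHTS[factor]
    return [line * weight for line in scanlines for weight in w]

print(allowed_heights(288, 1200))   # → [288, 576, 864, 1152]
print(expand([1.0, 0.5], 3))        # → [1.0, 1.0, 0.4, 0.5, 0.5, 0.2]
```

At higher factors you would replace the flat weight list with a sampled beam profile (the rounded, blooming shape discussed above), but the discrete-multiples constraint stays the same.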

3. Once your scanlines are in order, make sure that you maintain the correct aspect ratio by interpolating the horizontal resolution in lock-step with the vertical resolution. Note that the correct aspect ratio of a video raster (as seen on a properly-adjusted TV, or a TV-signals-compatible video monitor) is dictated solely by the EBU and SMPTE video/TV signal timing standards. That is, the shape of the image and its pixels is fully determined by the pixel clock producing the signal, and how it relates to the standard TV timings.

The nominal picture shape of the era was 4:3 but the shape of the actual picture you see on the screen depends on the device generating the signal, and how it paints the pixels relative to the so-called ”active picture” area and sync pulses defined by the TV standards. Often, the image produced by a computer or video game console of the era does not fully cover the entire ”active picture” of the video signal (as specified by standards), and therefore does not necessarily represent an exact 4:3 shape, but may have some other rectangular shape centered within the 4:3-shaped ”active picture”.
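The "aspect ratio follows from the pixel clock" argument can be made concrete. Using the nominal 625/50 figures (52 µs active line, 576 active lines, 4:3 display), the width/height shape of a single pixel drops out of simple arithmetic — the numbers below are the standard nominal timings, and the 13.5 MHz example is the familiar Rec. 601 sampling rate:

```python
# Sketch: deriving pixel shape purely from signal timings, as argued above.
# Nominal 625/50 figures: 52 us active line, 576 active lines, 4:3 picture.

def pixel_aspect_ratio(pixel_clock_hz: float,
                       active_line_s: float = 52e-6,
                       active_lines: int = 576,
                       display_aspect: float = 4 / 3) -> float:
    """Width/height of one pixel, assuming the device paints the full active line."""
    pixels_per_line = pixel_clock_hz * active_line_s
    return display_aspect * active_lines / pixels_per_line

# Rec. 601 sampling at 13.5 MHz gives ~702 pixels per 52 us line,
# i.e. the familiar slightly-wide PAL pixel:
print(round(pixel_aspect_ratio(13.5e6), 4))   # → 1.094
```

A device with a slower pixel clock paints fewer, wider pixels across the same 52 µs; a faster clock paints more, narrower ones. The display aspect never changes — only the pixel shape does.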

4. Also note that standard-definition TVs and video monitors are assumed to overscan the active picture. Some of the picture will go over the edges of the CRT screen and be cut off.

This was originally because the early CRTs were rather roundish in their shape, so the corners of the signal got cut off ”by design” so you could fill all of the surface area of the tube with image. Also, early CRT electronics could not hold the image all that stable: its size often fluctuated with the overall image brightness and drifted over time as the components aged, unless you readjusted the geometry potentiometers. This made it impossible to keep the edges of the active picture accurately aligned to the edges of the CRT.

Yet another reason for overscan was that analog production technology (analog TV/video cameras, analog VCRs/VTRs, etc.) could not guarantee a clean signal to the very edges of the nominal active picture. There was video head switching noise and other artifacts hidden behind the edges, so it was desirable to let the CRT ”cut” the edges and shape the visible image in a clean way.

Early TV-connectable home computer and video gaming systems countered overscan by generating rather big borders around the picture, to keep the action centered within an area that is actually visible on the tube. The later systems (such as the original Xbox, and optionally the Amiga) embraced overscan, instead: they could fill the active picture area of the signal to the very edges — with no borders. But while doing so, they also had to follow the same safe area guidelines that broadcasters adhered to when positioning text or graphics or important action.

A proper simulation should both simulate overscan and include an option to switch it off (switch into an ”underscan mode”), with the overscanned mode being the default, as it was on a normal TV or on a properly adjusted video monitor.
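The safe areas mentioned above can be expressed as centered rectangles. The classic analog-era rules of thumb were roughly 90% of the picture for "action safe" and 80% for "title safe"; exact percentages varied between broadcasters' guidelines, so treat the defaults here as illustrative:

```python
# Sketch of overscan safe areas: "action safe" was roughly the central 90%
# of the picture and "title safe" roughly the central 80%. The fractions
# are the classic rules of thumb; actual guidelines varied.

def safe_area(width: int, height: int, fraction: float = 0.9):
    """Return (x, y, w, h) of a centered safe rectangle."""
    w, h = round(width * fraction), round(height * fraction)
    return ((width - w) // 2, (height - h) // 2, w, h)

print(safe_area(720, 576, 0.9))   # action safe → (36, 29, 648, 518)
print(safe_area(720, 576, 0.8))   # title safe  → (72, 57, 576, 461)
```

An overscan-mode simulation would do the inverse: crop a similar margin off the edges of the simulated active picture before scaling it to the output window.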

5. There are some more interesting and esoteric things one could attempt when simulating the display of interlaced signals (in a truly interlaced fashion), or the persistence of phosphors on a CRT — especially given a high enough framerate (preferably an even multiple of the original vertical refresh rate). Doing this accurately would likely require profiling an actual 15kHz CRT monitor with a special high-speed camera.

6. Given a very high-resolution display, it might become possible to simulate aperture grille or shadow mask patterning, at least to some artificial extent: align the simulated mask exactly to the pixels of the output device (so as not to introduce interference patterns), but not to the simulated electron beam.

7. Gamma and other characteristics of the CRT color primaries/phosphors could also be modeled and simulated to the degree possible. Some emulators, for instance, used to have rather bad palettes which in no way matched what you would have seen on a CRT.
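The gamma part is the easy one to get right: a CRT's light output follows roughly a power law of the driving signal, conventionally modeled with an exponent around 2.2–2.4. A minimal sketch — the exponents are the conventional ballpark values, not measurements of any particular tube:

```python
# Sketch of CRT-style gamma remapping: decode the signal with the assumed
# CRT exponent, then re-encode for the target display (here approximated
# as a simple 2.2 power law). Exponents are conventional ballpark values.

def remap_gamma(value: float, crt_gamma: float = 2.4, out_gamma: float = 2.2) -> float:
    """Map a normalized [0,1] signal level through CRT decode / display encode."""
    linear = value ** crt_gamma          # light the CRT would have emitted
    return linear ** (1.0 / out_gamma)   # re-encoded for the output display

print(round(remap_gamma(0.5), 3))   # mid-grey comes out slightly darker
```

A proper treatment would use the measured phosphor chromaticities as well, but even this single-exponent remap fixes the washed-out look of naive palette dumps.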

8. There are likely to be analog overshoots and ringing in the driving video signal and its interaction with the electron beam control, as well as some filtering and electromagnetic effects that produce some subtler characteristics of a CRT image. Modeling these would require a good understanding of the inner workings of a CRT-based display and the signals driving the circuitry.

All of this should lead to an image that makes e.g. the text of a command-line shell look as smooth as it did on a CRT, instead of being a jaggy collection of Lego blocks. But the look of a video raster is a very different thing from simply blurring or interpolating the original pixel data — so if one thinks mere blurring should do the trick, that is a very mistaken assumption.

Then again — and to get back to where we started — I do not think simulating artifacts such as RF noise or interference, or shadowy ghost images, or PAL or NTSC color artifacting gains much — except as a gimmick that you will try out once, then switch off forever. Such artifacts were undesirable back then and they are also (mostly) undesirable now.

What you want is rather a simulation of a clean signal driving the CRT. Back in the day, everyone would have wanted to get as clean a signal on their CRT as possible, with no extra artifacting caused by the inherent limitations of some signal paths, but of course displayed in the technical manner video raster was supposed to be displayed on a CRT. RGB signal, where natively produced by the computer or game console, and where available as a supported input option on your monitor, was the holy grail of picture quality, also for TV-signals-compatible devices. Some cheaper systems just were not designed to output their video as RGB signal since they primarily targeted users who were assumed to use domestic TVs as their displays instead of the more professional and more expensive CGA computer/video monitors. And, in many markets, domestic TVs also did not feature an RGB input.

Europe was lucky in this regard, though, since the French insisted on getting their SCART connector on the TVs sold here, and the SCART connector specified analog RGB input pins along with composite video. So if your signal source supported RGB in the way e.g. the Amiga and many fourth or later generation home video game consoles did, you could get a very clean picture even on a domestic CRT TV.

(Later on, the ubiquitous RGB support on European TVs also helped with DVD players and digital set-top boxes, which in their European versions commonly included a SCART connector and outputted RGB signal, producing as crisp an image as the CRT was technically capable of displaying.)

(OK, NTSC color artifacting was used in some early systems as an indirect, cheap way of creating a color signal without including ”actual” color support, so it is desirable to optionally simulate it in some cases — but its usefulness is limited to the specific systems and software titles that made use of this approach. For any other titles, you'd rather switch off such color processing and treat the input as if it were monochrome or RGB signal, getting a clean image that is free of accidental color artifacts.)

Yet another thing is, of course, the physical size of the image. In the 1980s and early 1990s, the CRTs of RGB video monitors and portable TV sets were around 14 inches in size. Domestic CRT TVs maybe weighed in at 25 to 32 inches. Factor in the typical viewing distances and you'll see the viewing angle was quite narrow-ish compared to the modern, gigantic screens filling your entire field of vision.

There are some reasonable practical limits to how thicc a scanline could be (or should be in its simulated reproduction), what its proper relation to the size of the entire video raster is, and how big a part of your field of vision the original images would have covered. If you go much beyond those limits — blowing up the image to cover too large a part of your field of vision just because your screen is so big — the pixel art, with its now fist-sized pixels, will no longer appear the way it was originally assumed to be displayed.

Submission + - SQLite Adopts Monastic Code of Conduct (sqlite.org)

An anonymous reader writes: Undoubtedly in response to this politically motivated sort of claptrap, SQLite has released their own Code of Conduct. From the preamble:

Having been encouraged by clients to adopt a written code of conduct, the SQLite developers elected to govern their interactions with each other, with their clients, and with the larger SQLite user community in accordance with the "instruments of good works" from chapter 4 of The Rule of St. Benedict. This code of conduct has proven its mettle in thousands of diverse communities for over 1,500 years, and has served as a baseline for many civil law codes since the time of Charlemagne.


Comment Re:Probably the home router... (Score 3, Informative) 574

As it stands, your carrier does NAT themselves and gives your router one IP address, typically in the 10.0.0.0/8 address space. Your home router then does another layer of NAT, and gives internal devices their own IP address range in the 192.168.0.0/16 address space.

Not where I live, and that sounds quite limiting! Thank ${DEITY}, ISPs here in Finland assign their customers genuine public IPv4 addresses, usually via DHCP. Typically, you can even get several of them – the maximum on a consumer connection could be something like 5. (I’m using 2 right now.) Only something like port 25 (SMTP) is blocked for inbound connections, so you’re free to run a personal web server, SSH box, VPN to your home network, etc.

Finnish cellular carriers – as opposed to the actual fiber/copper/cable ISPs – have a different practice, though: they will usually NAT the 3G/4G customers by default, which is quite understandable, as you generally do not want inbound connections to a cellphone. Still, at least my carrier (Saunalahti) lets advanced customers choose a different APN which will give a public IPv4 address even for a 3G modem or a cellphone, which is quite nice and handy as well for some situations.
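Incidentally, checking which situation you are in is easy to script. Python's ipaddress module knows the RFC 1918 private ranges out of the box; carrier-grade NAT uses the separate 100.64.0.0/10 shared block (RFC 6598), which is checked by hand here:

```python
# Sketch: classify the address your ISP handed out. RFC 1918 private ranges
# are covered by is_private; the carrier-grade NAT shared space (RFC 6598,
# 100.64.0.0/10) is checked explicitly.

import ipaddress

CGNAT = ipaddress.ip_network("100.64.0.0/10")

def classify(addr: str) -> str:
    ip = ipaddress.ip_address(addr)
    if ip in CGNAT:
        return "carrier-grade NAT (RFC 6598)"
    if ip.is_private:
        return "private (RFC 1918)"
    return "public"

print(classify("10.11.12.13"))   # → private (RFC 1918)
print(classify("100.72.1.2"))    # → carrier-grade NAT (RFC 6598)
print(classify("8.8.8.8"))       # → public
```

If the address on your WAN interface classifies as CGNAT or private while your "what is my IP" page shows something different, you are behind carrier NAT and inbound connections will not reach you.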

Comment Re:By Year... (Score 1) 219

Applications (Office, Photoshop, etc) have a very short shelf life. Anything over a couple of years old is useless. Languages (Perl, PHP, Ruby): throw away after a decade or so. It differs, though; old C books may still apply, old Java books less so. Theory (algorithms, methodologies) should be good for a long, long time.

Manuals for old, unique (non-PC) hardware platforms, peripherals, and programming environments may still have relevance to them, though.

There’s always the retrocomputing/historical angle where you’d want to preserve books such as a register-level hardware reference explaining the capabilities of an old 8-bit home computer model for an aspiring programmer, or a system administration guide for an old minicomputer, or programming manuals (entry-level or not) for such systems. Or user manuals for the seminal applications which were driving the sales of such systems.

Also: some people are very interested in preserving old product catalogs with pictures and technical information of what was available for such systems back in the day.

Comment Re:So... no separation between system and userspac (Score 5, Informative) 335

It was also single-user, was it not?

That is correct. Single-user designs were the norm with personal computers of the era. There are some ways around this, but they're sort of limited.

The lack of memory protection is due to the first models being designed around the plain Motorola 68000 CPU, which lacks a memory-management unit (MMU). Later models were available with beefier and more feature-rich processors from the 680x0 series, some of them including an MMU. You could also buy add-on “turbo cards” (processor cards taking over the functions of the main CPU, effectively replacing it with a faster one). But by then it was too late: the OS relies heavily on shared libraries and message passing in a flat, shared, unprotected memory space.

Otherwise, the Amiga hardware platform and AmigaOS – the first model/version having been released in 1985 – included concepts such as preemptive multitasking, windowed GUI toolkit in the system ROM (no “text mode” at all), overlapping “screens” in different resolutions and bit depths, hardware blitter and DMA-based I/O (including multichannel sampled stereo sound), drivers for devices and filesystems, the “AutoConfig” system for add-on devices (fulfilling the same role as PnP did later in the Wintel world), 8-bit ISO Latin-1 character encoding as standard, windowed command-line shells, shell scripting, inter-process scripting (ARexx), an OS-provided framework of multimedia “datatypes” (handlers/decoders/players for common file types), scalable fonts, clipboard, speech synthesizer as a standard OS feature, etc.

Ignoring Linux and OS/2 for a moment, in some ways it felt like the Wintel camp only caught up ten years later when Windows 95 was released to the masses — and at that point, both the OS and the “IBM-compatible” PC hardware platform were still missing some key features and novel ideas that made the AmigaOS so great and elegant in its day.

Comment Re:5 1/4 HD's (Score 1) 195

One of the Compaq mid-tower lines used those drives. Quantum Bigfoot. I worked at Computer City at the time, and every time one of those towers came in for service, it was for a bad drive. [...] I always wondered if the problem was that the size of the platters just made them too unstable, or if the manufacturing process had flaws.

Quantum Bigfoots (some early models) had a known data corruption problem which could be prevented (altogether, or from getting worse) by applying a firmware upgrade from revision A01.02 to A01.03 or later. Alas, a later version of the firmware cannot fix a drive which already has corrupted areas on its platters: it can only prevent further damage.

Comment A couple of semi-useful use cases (Score 1) 359

I wouldn't pay much extra specifically for getting a touch screen on an ordinary laptop, but if it's there, I don't mind. (I recently bought an Asus laptop with one but it wasn't a factor in that purchase at all.)

There are two use cases where I've found the laptop touch screen kind of useful:

1) Developing and testing touch screen apps, or ideas for those, for touch-screen based devices which are not laptops. Pretty much a no-brainer. It's kind of neat to be able to do that on your ordinary, "serious" machine even if you're never actually going to use the app that way outside of testing.

2) One-handed browsing sessions. (Yeah, but no - not necessarily only those. More like browsing something while eating or enjoying a hot cuppa with your other hand holding a sandwich, or the cup.) Actual tablets do this use case better, but if all you've got in front of you is an ordinary laptop and you simply want a new screenful to read, would like to follow a link or zoom in a bit or pan around, poking the screen with your non-occupied fingers is sometimes the most straightforward and effortless thing. Especially if it is also your non-dominant hand which you don't normally use for controlling the mouse pointer.

Comment Re:CG NAT is not new! (Score 2) 338

Your cell carrier doesn't count as an ISP for your smartphone? You don't get a publicly routable address on any cell network I've used.

At least Saunalahti in Finland offers publicly routable IPv4 addresses to their mobile customers. You have to activate the feature in the self-service portal and use the correct APN so generally only those who know what they're doing would do it, but it is all documented on their website. The feature is free of charge.

Comment Similar projects in other EU countries (Score 1) 178

Similar community-driven projects have been carried out in other EU countries, such as Finland.

Here’s one such example from the region that geographically centers around Töysä – a small rural community of 3,000 people – and its neighboring towns/municipalities, some of which are a bit larger, but not much:

Verkko-osuuskunta Kuuskaista (The Network Co-operative Kuuskaista)

6net+ core network (a PowerPoint presentation)

Comment Re:Please include flash! (Score 1) 181

And yes, YouTube's autoplay is very annoying, espescially if you open the video in a new tab. I wish I knew how to just have it disabled globally.

There’s a GreaseMonkey “user script” called YousableTubeFix. If installed, it helps get rid of many YouTube annoyances – including the completely needless autoplay feature.

Comment Re:Under-appreciated (Score 1) 704

Under-appreciated, you say?

On CP/M:

  • WordStar, one of the most influential word processors of its time. Even today, several character-mode text editors make use of some of the shortcuts which originated in WordStar.
  • The CP/M operating system itself, which was quite popular back in the day and gave inspiration to PC-DOS/MS-DOS.

On the Commodore 64:

  • GEOS by Berkeley Softworks. Who would have thought the venerable C64 could host a GUI system almost comparable to the first Macintosh models (not quite, but surprisingly close, given the 8-bit processor, memory limits, etc.) There was a host of serious productivity applications for this environment.
  • PageFox: a desktop publishing system for the C64.
  • Microrhythm: a digital drum machine based on the undocumented sample playback features of the SID chip.
  • The SID audio chip, which was way more feature-rich than its competitors of the time, and in some ways comparable to a “real” synthesizer, giving actual character and resonance to computer music, instead of just beeps and blips. Its creator, Bob Yannes, later went on to found Ensoniq, a company which designed and manufactured actual musical instruments (keyboards, samplers, etc.) The SID was a unique piece of audio hardware which enabled the musical software of the C64 to do its magic – and its legacy still lives on in the form of numerous emulators, vast sound archives and libraries (such as HVSC), custom-built musical instruments based on the chip (such as the SidStation), etc. This is one of those cases where a piece of hardware has been inspirational and influential and enabled a number of software applications which would have been pointless if it weren’t for the hardware.

On the Amiga platform:

  • (The Ultimate) SoundTracker by Karsten Obarski, later followed by the even more popular, more advanced clones or derivatives: NoiseTracker and ProTracker. These started the whole computer music “tracker” genre as we know it today – with four sound channels in a stereo arrangement and digital instrument samples, no less.
  • Audio Master, one of the first digital audio sample editors for an affordable personal computer. Supported stereo sound as well. (Often accompanied by inexpensive audio digitizers attached to the printer port.)
  • Deluxe Paint by Dan Silva of Electronic Arts – the first paint program for the Amiga. Taking a different approach from its predecessors on other platforms (which were mostly toys), Deluxe Paint was a very powerful bitmap graphics art package, featuring advanced multi-color, blitter-enhanced free-form brush handling and color cycling effects, and was used for creating graphics for innumerable games and demos – also when developing for platforms other than the Amiga. It later gained animation features as well. Still a great paint package for creating graphics for any old computer with paletted graphics modes. Also introduced the EA IFF format, which was then adopted across the Amiga platform as the standard structured format for many types of files. It was later also adopted on Windows in a slightly modified form (RIFF – for example, the so-called “wav files” are in RIFF format), but not as widely.
  • Photon Paint: A paint program making use of the photorealistic 4096-color “HAM” mode.
  • Real 3D. One of the first CSG solid modeling ray-tracing packages for personal computers. Supported smooth-curved quadric surfaces (instead of the jaggy polygons of its contemporary competition), calculated reflections and transparency in materials in a physically accurate, realistic way (a lens would actually function as a lens), and made good use of the Amiga’s 4096-color “HAM” graphics mode.
  • NewTek’s Video Toaster: the first studio-quality genlock, video titling / paint box / CGI / visual effects / animation toolkit, and video mixing system, implemented on top of an inexpensive, generic-purpose multimedia computer platform with the aid of the supplied video mixing hardware. (Also gave birth to the LightWave 3D modeler/renderer.)
  • Scala Multimedia – the versatile video titling and info TV application, tightly synchronized with the vertical refresh of the TV-compatible video modes, so you could get absolutely silky-smooth, professional-looking scrolling and wipes. (Unlike in PowerPoint etc. where they will always stutter.)
  • The AmigaOS itself: the first widely available, mouse-driven, windowed, pre-emptively multitasking GUI operating system, which came standard with an affordable personal computer. One of the unique features of the AmigaOS and its supporting hardware was the ability to run applications in multiple different graphics modes (pixel clocks, bit depths) yet display them on the screen at the same time.
  • The Amiga platform itself: the first true “multimedia” computer which was released before the term “multimedia” was coined. (Pre-emptive multitasking, multi-channel stereo sound based on digital sound sampling and DMA playback, speedy hardware blitter and a simple co-processor, photorealistic-at-its-best graphics and animation capabilities, compatibility with TV standards, built-in support for synchronizing its video raster to an external video source with the help of a genlock, etc.) On other platforms of the same era, these features would not have been available to the software applications.
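Since the EA IFF format and its Windows derivative RIFF came up above: both share the same simple chunk structure – a 4-byte ASCII chunk ID followed by a 32-bit length – differing mainly in byte order. A small sketch of reading such a header:

```python
# Sketch of the IFF/RIFF chunk structure: a 4-byte ASCII chunk ID followed
# by a 32-bit length. IFF stores the length big-endian; Microsoft's RIFF
# derivative stores it little-endian.

import struct

def read_chunk_header(data: bytes, big_endian: bool = True):
    """Return (chunk_id, length) for the chunk starting at data[0]."""
    fmt = ">4sI" if big_endian else "<4sI"
    chunk_id, length = struct.unpack_from(fmt, data, 0)
    return chunk_id.decode("ascii"), length

# An IFF FORM header (big-endian) vs. a RIFF header (little-endian):
print(read_chunk_header(b"FORM\x00\x00\x00\x2a", big_endian=True))    # ('FORM', 42)
print(read_chunk_header(b"RIFF\x2a\x00\x00\x00", big_endian=False))   # ('RIFF', 42)
```

The elegance of the scheme is that a parser can skip any chunk type it does not recognize just by honoring the length field, which is why the same container design survived from 1985 Amiga files all the way to today's WAV and AVI files.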

I probably forgot to list far too many interesting things, but there you have it. For some applications, I find it difficult to separate the hardware from the actual application: ground-breaking computer applications often needed never-before-seen, ground-breaking computer hardware to support them, which was very much the case with the Amiga, for example.

Comment Re:If it ain't broken... (Score 1) 160

I can't get over how fast that client is... after sitting through broadcast Teletext and waiting for pages to cycle round, it's interesting to see how much more usable the system becomes when it's responsive.

Many modern TV sets (from mid-to-late 1990s and onwards) and DVB set-top boxes can cache all Teletext magazines – including subpages – which makes browsing the content a breeze.

Comment Re:If it ain't broken... (Score 1) 160

The DVB standard ETSI ETS 300 743 defines a method for transmitting Teletext data over a DVB (MPEG-TS) stream. This method is used in the DVB countries which maintain the old Teletext service alongside the digital broadcasts. DVB set-top boxes and TV sets with an integrated DVB receiver/decoder commonly include a Teletext browser.

The UK has chosen to abolish Teletext in favor of an MHEG-5 based information service – known to viewers as the “red button” service. However, this does not mean it wouldn't have been technically possible to carry the full Ceefax Teletext service over DVB broadcasts in the UK as well: the local broadcasting companies just chose not to do it.

Broadcasters in some countries – such as Finland and Italy – have experimented with providing a similar information service through the more ambitious, Java-based MHP, instead of MHEG-5. But in Finland, at least, these experiments have failed due to an unenthusiastic response from the STB makers and the general public. (The MHP services were probably introduced at too early a stage, and it was partially a PR failure as well.) Instead, regular Teletext service is still being broadcast on many local DVB channels using the method defined in ETSI ETS 300 743.

Comment Re:Small Market (Score 2) 284

I wouldn't think there would be a big market for movies subtitled in Finnish - even in Finland I think most people can understand other languages (like English, or French, or German or the other Scandinavian languages)

Nearly all foreign TV shows and movies which are shown on Finnish TV channels are subtitled in Finnish – that’s the norm here. (Dubbing is normally never used here except for the content intended for kids under reading age. Also, the narrated sequences in some documentaries are sometimes re-narrated in Finnish whereas their on-screen dialogue remains subtitled. But those are pretty much the only exceptions.)

The same practices go for actual movies seen in a movie theater, and the shows and movies released on DVDs or Blu-Rays.

DivX Finland is mostly providing subtitles for the purposes of watching shows and movies which are not (yet) made available in Finland through official channels, or – as it might be the case with some more obscure foreign TV shows, for example – never will.

Some individuals who are fluent in some particular foreign language – usually English in the Finnish context – take pride in watching shows or movies of that language without subtitling. Yet, the norm here – which is also reflected in the default settings of the set-top boxes and TV sets – is that subtitles for translation are always on and visible. (The local TV channels show a lot of foreign content which specifically calls for translation. It just wouldn’t fly if they tried to provide it to the Finnish-speaking audience “as is”, with no effort to translate. People expect the translations to be there.)

Some Finnish-language shows may also have subtitles in Finnish for the benefit of the hard of hearing. But these subtitles, which are not about translation but transcription, are never seen by the normal folks as you need to fiddle with the settings of the TV to receive them. And only those who have the need will want to see them on their screen anyway. (There is also no legal requirement to provide such Finnish-on-Finnish transcription service, so the availability of such special-needs subtitles is pretty much limited to some select shows produced by YLE, the local public broadcaster.)
