At Home Video : An Amazing Revolution, part 2 (the 2000s and beyond)

Whether modular or single units, television monitors are now available in unimaginably huge sizes and with stunningly bright, clear resolution.

This is the second part of our article on the amazing evolution of at-home movie watching.  It actually starts in the very late 1990s, with the introduction of new higher-resolution digital televisions, and continues on to the present day and the likely future.  Our first part, which covers the period through the DVD era, can be seen here.

Home Video Part 4 :  The Television Set Gets Upgraded Too – Digital Flat Screens

The quality of DVD exceeding the quality that television sets could display created, of course, a technological gap – and a consumer demand – that the manufacturers raced to fill.

Video display technology was being aggressively improved at the same time.  Several factors contributed to this.  It was driven in part by computers and their video display abilities/needs.  When color computer monitors first came out, their resolution was “CGA” – 320 x 200 pixels – something easily handled by television monitor technology.  But the EGA resolution of 640 x 350, which came out in 1984, was already pushing standard television resolution to its limits, and extensions to the EGA format, up to 720 x 540, were beyond broadcast television quality capabilities.

New higher definition digital television standards, as well as new computer display standards, were also being developed.  These would look much better than traditional “NTSC” and related (PAL & SECAM) analog format video – but they would need new televisions to display them.

Another and important factor was the desire of television set manufacturers to come up with something new and better to encourage people to keep buying televisions.  The television set manufacturers had enjoyed several booms to date – first the initial sell-in of black and white televisions after the introduction of broadcast television in 1939, then the introduction of color sets from 1953, then steady improvements in picture quality and increases in picture size, then price drops that encouraged families to buy a second (and even third/fourth/etc) set – but they’d gone about as far as they could go within the constraints of the analog NTSC video format.

Digital television would also allow more television programs to be squeezed into less radio bandwidth – another important benefit, given the growing squeeze on radio frequencies.  A semi-international digital standard was agreed upon in 1993, and in 1996 the Telecommunications Act required all television signals to be in digital format by 2006 (subsequently delayed to 2009).  In anticipation of this, new digital televisions started appearing, offering higher resolution than the older analog sets.  Instead of somewhere between 250 and 500 lines of video resolution, the first generation of digital televisions started offering 720 pixels (what was formerly called lines) of resolution.

With a new digital standard finally resolved in 1996, new digital sets with wider screens (a 16:9 ratio, compared to the earlier 4:3 – i.e., 12:9 – ratio) and 720 pixels of resolution quickly followed.  In 1997, Sharp and Sony introduced the first large flat screen TV.  It had a 42″ screen size, which at the time was larger than any other domestic screen offered, and was priced at $15,000 (about $25,500 in 2021 dollar terms).  A 43″ television today can cost less than $255 – a one hundred fold reduction in cost over the last 24 years.

The first set wasn’t popular (because of its price), but prices quickly started to drop.  The original 720 pixel resolution sets were followed by 1080 pixel resolution (1080×1920) sets, which in their 1080p (progressive, rather than “i” for interlaced, picture) form became the best type of television available, capable of displaying better-than-DVD video.

One other point about the higher resolution televisions.  Higher resolution allowed for bigger screens, and that was a desirable improvement both for the manufacturers and for consumers, eager to enjoy bigger screen experiences in their living rooms (and bedrooms, kitchens, etc).

At the same time that the picture standard was evolving, so too was the technology to display it.  Until the time of the new digital wide-screen formats, television sets relied upon a very heavy and bulky glass cathode ray tube (CRT) on which they displayed their image.  These required a lot of depth for the image to be projected onto the screen, and they weighed a great deal.  The depth factor was becoming a constraint – it was impossible to make televisions with screens much larger than about 40″ and still have them fit through the doorways of a regular residence.  The weight was also a concern – today a 40″ flat screen television weighs about 15lbs (and might be as little as 3″ deep), whereas the 40″ Sony XBR CRT television weighed over 280lbs.

Several different new types of technology were being developed to allow for the larger screened televisions that people wanted.  Plasma screens were among the first to appear, and then there were projection televisions that would project images from tiny internal screens, via a lens, onto a large viewing screen.  The projection systems were still bulky and heavy, but allowed for larger screens than would have been possible with CRT technology.  The plasma screens were great, but became increasingly expensive compared to other methods of creating a large viewing image, and had some problems with longevity.

By the mid 2000s, most televisions no longer used plasma technology.  New LCD technology was cheaper, thinner, and lighter.

Home Video Part 5 :  Recorded Video Catches Up With New TV Capabilities – Blu-Ray

We opened part four of this article at the point where, for the first time, consumer priced recorded media offered better quality video than television sets were capable of displaying.  That changed, and by the mid-2000s, more and more people were replacing their older CRT analog televisions with digital flat panel televisions – not only because they wanted to, but because they needed to, in order to be able to receive the new digital television transmissions that were to be the only type of television broadcast by the mid/late 2000s.

This meant that, again, people had displays that were more capable than the DVDs being played and shown on them.  Which meant it was time for the other half of the equation (the recording standards and playback media) to catch up with the capabilities of the monitors people had in their homes.

This happened in the form of a newer type of disc.  It had the same physical size and appearance as the existing DVDs, but used a different type of recording standard that allowed for greatly improved resolution and quality.  A DVD was limited to 480×720 pixels or 576×720 pixels depending on the regional format being used (in the US, the limit was 480×720) and displayed the video in an “interlaced” format – half the picture was shown/updated, then the other half, then back to the first half again, and so on, done at such a speed it seemed like one steady smooth complete moving image.
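
(For readers who like to see the mechanics, here is a minimal, purely illustrative Python sketch of how two interlaced “fields” weave together into one complete frame.  The frame height is the US DVD figure quoted above; nothing else in the sketch comes from the DVD specification itself.)

    # Illustrative only: how two interlaced "fields" weave into one full frame.
    FRAME_HEIGHT = 480   # a US (NTSC) DVD frame is 480 lines of 720 pixels

    def weave_fields(top_field, bottom_field):
        """The top field supplies the even-numbered lines, the bottom field the
        odd-numbered lines; together they make one complete frame."""
        frame = [None] * FRAME_HEIGHT
        frame[0::2] = top_field      # lines 0, 2, 4, ...
        frame[1::2] = bottom_field   # lines 1, 3, 5, ...
        return frame

    # Each field paints only half the picture; alternated quickly enough
    # (roughly 60 fields per second for NTSC), it looks like one smooth image.
    top = ["even line %d" % i for i in range(0, FRAME_HEIGHT, 2)]
    bottom = ["odd line %d" % i for i in range(1, FRAME_HEIGHT, 2)]
    assert len(weave_fields(top, bottom)) == FRAME_HEIGHT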

This contrasted with many televisions having a resolution of 720 x 1280 pixels, and an increasing number capable of 1080 x 1920 pixels.  There are six times more pixels on a 1080 x 1920 screen than in a 480 x 720 DVD frame (and the effective difference is even greater, due to the difference between interlaced and progressive scanning).
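
(The “six times” figure is easy to check with a couple of lines of arithmetic – a quick Python sketch using the resolutions quoted above:)

    dvd    = 480 * 720     #   345,600 pixels in a US DVD frame
    hd720  = 720 * 1280    #   921,600 pixels
    hd1080 = 1080 * 1920   # 2,073,600 pixels

    print(hd1080 / dvd)    # 6.0 - six times the pixels of a DVD frame
    print(hd720 / dvd)     # about 2.7 times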

The newer type of disc is called Blu-ray, and it supports 1080 x 1920 (progressive scanned) video.  At the time of its release in 2006, this was the “gold standard” for video quality.  There was nothing better available for consumers.

But, and for the first time in this narrative, the theoretically six times better image on a Blu-ray was no longer so obviously apparent to a regular person watching a regular program.  Or, perhaps better to say, we were starting to enter the realm of “vanishing returns”.  There were also essentially no improvements in audio quality.

It could also be said that Blu-ray came out too soon – less than ten years after the introduction of DVD.  People who had only recently upgraded to DVD were reluctant to upgrade another time, and while, after seeing DVD, no-one would ever want to watch a VHS tape again, it was possible to more or less convincingly persuade oneself, at least in the early years of Blu-ray, that DVD was “good enough”.  Even the high end enthusiasts were hesitant, because many had already bought the same movie first on VHS, then on DVD, and perhaps in multiple versions as well – first the regular theatrical release, then a director’s cut, then an “extended version”, then a special collector’s edition with added extra materials, and so on – and were reluctant to buy it all yet again on Blu-ray.

Blu-ray coexisted alongside DVD (and still does, even now, 15 years later) rather than replacing it, and at least for the first few years, Blu-ray discs and Blu-ray players cost a great deal more than the DVD equivalents.  Blu-ray seemed destined to become another laser disc technology, reserved for high end enthusiasts while a less good and less expensive but massively more popular technology continued in parallel.  Over time, though, price differentials moderated, with Blu-ray discs seldom costing more than $10 above the DVD equivalent, and sometimes being the same price or lower.

Blu-ray players also played DVDs (but not vice versa) so when the players came down in price, people would buy a Blu-ray player, even if they continued to buy DVDs to play in it; indeed, DVD-only players became rarities.

Blu-ray would have eventually and inevitably replaced DVD, but for an unexpected new source of competition that started to supersede both DVDs and Blu-rays.

Home Video Part 6 :  Video Goes Virtual – Streaming

The choice for watching at-home video was simple.  One could either rent or buy a movie.  There was no real third option – the nearest alternatives were “go to a theatre”, rather than watch at home, or perhaps “watch whatever is on television”, but neither of those gave you control over what content you watched, and when.  You couldn’t pause a movie in the theatre or on television.  You couldn’t choose the time of day or week when you watched it.

It was even more restrictive with audio.  There was only one choice.  Buy a CD.  End of story.

Except that, by this time (about the mid 2000s), other choices for audio were appearing.  You could buy individual songs online (initially through iTunes), and you could listen to music over the internet via audio streaming.  This concept evolved into subscription services offering vast libraries of music you could access online (but not download to keep), in return for a fixed monthly fee.

Audio was ahead of video with such concepts, because audio required much less bandwidth than video – you could get acceptable audio with a 128kbps bandwidth.  Video required at least ten times as much, and as video quality improved, it required more and more bandwidth.

However, the concept of video streaming was being experimented with – for example, YouTube launched in December 2005, about the same time as Blu-ray.  But, at the time, YouTube had very limited video quality and was only for people to share their own content, not for movies to be distributed.

Of course, YouTube’s quality rapidly improved.  In March 2008, it added the ability to store and serve 480p videos – i.e., DVD resolution (although much more compressed, so not as good an overall picture quality); in July 2009, it added 720p video, and in November that year, it added 1080p.

In July 2010, it added support for the new 4K format, and in October 2014, it added 60 fps capabilities too.

But, we’re getting ahead of ourselves.  Moving back to the mid/late 2000s, the major defining event came in 2007, when Netflix launched a new video on demand streaming service.  Netflix was by then the major DVD rental service, its unique twist being that it sent you discs in the mail and you returned them in the mail, with no need to visit a retail store and no late fees (instead, there was a limit on the number of videos you could be renting at the same time).

At the time the streaming service launched, Netflix had 1,000 titles available for streaming, compared to about 100,000 titles available on its mail-exchange DVD rental service, but it quickly became apparent that online streaming was where the future would lie – both for Netflix and for many of its customers too.  DVD sales went into immediate decline – not because Blu-ray numbers were growing, but because of Netflix.

Originally, Netflix sold its video streaming as an add-on to its DVD rental plans, and limited the number of hours of streaming you could get each month.  In 2008, it added a plan where, for $9 more a month over and above a DVD rental plan, you could stream unlimited movies online.

By 2009, Netflix was fielding 12,000 streaming titles, and two years further on (2011) it made the first of several moves to break away from DVD mail-exchange, creating a new subscription level:  $8/month for unlimited streaming and no DVD rentals.  This proved to be the point at which Netflix really transformed the industry.

It is worth mentioning there were other video streaming services in the early days too, but they generally limited the amount of streaming rather than allowing unlimited viewing, and/or added advertising to the content they streamed, and/or had only a limited amount of content available.  Netflix had unlimited streaming, no advertising, and a broad base of content.

On the other hand, freeing itself from its legacy DVD mail-exchange rental business has been surprisingly difficult (or perhaps not so surprising – as we write this, not everyone in the US has sufficiently fast internet to stream video, even now).  But between Netflix’s DVD mail-exchange and streaming services, and the subsequent appearance of more and more on-demand streaming competitors, the life has gone out of buying movies on discs of any type, just as happened to CD sales compared to music streaming.

Home Video Part 7 :  Better Everything

So, by the early 2010s, we had 1080p television sets – 720p sets were no longer even being manufactured, except in very small screen sizes.  We had Blu-ray discs, as well as continuing to have DVDs too.  Video streaming was now capable of delivering at least “DVD quality” and increasingly “Blu-ray quality”.  Streaming was displacing both Blu-ray and DVD discs (as rentals or purchases), and the 1080p video resolution standard and its associated audio were excellent in all respects and for all purposes.

Had we reached a new plateau of “perfection”?  That was the fear of the hardware manufacturers.  They’d enjoyed a lovely new round of television set sales due to the transition from analog to digital television, but now that was dying down, they needed something new to drive the next round of hardware sales.

They tried to sell 3D video, a concept that thankfully failed to be accepted in the marketplace.

So they decided to sell a new, even higher resolution format – what became known as “4K”.  Instead of the 1080×1920 format that had become more or less standard, the new 4K format offered 2160×3840 pixels – twice as many pixels in both dimensions, making for four times more pixels in total.  The first 4K television was released in 2012, at a substantial price premium over 1080p televisions.

On the face of it, 4K resolution is an enormous improvement.  But unless you sit very close to the screen, you won’t notice any difference, because at even a moderate distance the pixels blur and merge into each other – just as a printed picture, which is actually a collection of dots of just three or four colors, merges into what looks like a solid picture with thousands, even millions, of different colors.

However, there was one added new feature that could make 4K a much more compelling upgrade.  As well as offering higher resolution, 4K also offered more colors and more contrast.  The earlier television pictures had what seemed like an almost infinite number of colors (16.8 million), but all those colors boiled down to combinations of just 256 different shades of each of the three primary colors (red, green and blue).  4K could (though not every set or source actually offered it) grow to 1024 different shades of each color, taking the total color palette from 16.8 million to 1.1 billion different colors.  It also promised to extend the range of colors – the earlier sets couldn’t display all the colors that we can actually see in real life, while the new wide color gamut (WCG) 4K sets offer not just more shades, but more colors entirely.
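
(The palette arithmetic is simple enough to check – 256 shades is 8 bits per color channel, 1024 shades is 10 bits, and the total palette is the per-channel count cubed.  A quick Python sketch:)

    shades_8bit  = 2 ** 8    # 256 shades of each primary color
    shades_10bit = 2 ** 10   # 1024 shades of each primary color

    print(shades_8bit ** 3)   # 16,777,216 - the "16.8 million" colors of older sets
    print(shades_10bit ** 3)  # 1,073,741,824 - about 1.1 billion colors with 10-bit video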

There are two new specifications for how well a television can display colors – the DCI-P3 standard, and the Rec. 2020 standard.  The Rec. 2020 standard is the more demanding of the two.  Sets are usually specified in terms of what percentage of all the colors in these two standards they can display.  Good sets these days will offer 95% or more of the DCI-P3 standard, and 80% or more of the Rec. 2020 standard.

The other part of this is the greater contrast – the ability for sets to show brighter whites, darker blacks, and to have more details in the shadows and highlights.  Instead of dark patches just being solid black with no detail, you could see detail in the shadows, and in the bright highlights too, due to a High(er) Dynamic Range (HDR).

There are at least four different “standards” for adding extra dynamic range to a video.  The main three are HDR10, HDR10+, and Dolby Vision.  Dolby Vision is the best, HDR10 the least good of the three.  There are some other seldom/never used standards too (eg HLG), but for now, to be sure of being able to get the best picture from all sources, you want to get video gear that supports all three main standards.  Not much gear does, and that which does tends to be more pricey (of course).

These two features – better color and contrast – are clearly visible on any 4K WCG HDR display (assuming it is playing content encoded with these features – a regular DVD, for example, won’t look any better).  But not all displays offered WCG and HDR, even though all 4K displays offered the 4K resolution.  Nowadays HDR in one form or another is common, but WCG is more elusive.

To start with, 4K televisions were of course massively more expensive than 1080p televisions, but prices quickly fell – the first under-$1000 4K television was released as early as 2013 – and the cost differential on larger screened sets started to shrink.

Nowadays, the cost differential for a 4K set is essentially non-existent, because lower resolution televisions are no longer made, other than in the smallest screen sizes.

With the new television display capabilities came a need for new content to show on them.  The studios were slow to release 4K material, which slowed the adoption rate, much to the manufacturers’ annoyance.  But slowly movies were either re-mastered in 4K or new movies were shot in 4K, and Netflix started offering 4K streaming in April 2014, albeit at an extra cost per month.

The Blu-ray standard evolved to a 4K variant in 2016, but whereas the step up from DVD to 1080p discs brought a completely new name (Blu-ray), this time around the new standard was simply named “4K Ultra HD Blu-ray” – a bit of a mouthful, but presumably intended to make it easier and more encouraging to upgrade one more time to this new format.  The new 4K Ultra HD Blu-ray players can also play regular Blu-ray and DVD discs too.

4K Blu-ray movies sell for about $5 – $10 more than regular Blu-ray movies, which in turn seem to sell for about $5 – $10 more than DVD versions of the same movie.  Of course, exceptions exist, both higher and lower.

Most new movies on 4K Blu-ray seem to sell for about $30, while older movies may appear for $15 – $25 each.  Sometimes you’ll get sets of multiple movies at much lower costs per movie included.  Strangely, many 4K Blu-ray discs come complete with a “free” second copy of the movie, in regular Blu-ray format, too.

The Future and its Problems

So have we now reached almost the final peak of video perfection?  Sure, there is still work to be done with more improvements in display picture quality (the gamut and contrast – WCG and HDR elements) but these can all be done, and are being done, within the 4K standard.  This is important – the current 4K standard still has a long way to go before it is optimized.

However, a bit like with cameras, the feature that most people focus on is pixels (even though – also as with cameras – pixels are no longer the most significant determinant of actual picture quality).  So there is a movement to create still higher resolution/higher pixel count televisions.  There have been some hesitant half steps to 5K or 6K, but most of the development work seems to be on 8K resolution (4320 x 7680 pixels).

This is definitely going beyond the point of vanishing returns for most of us.  To put that number into context, if you go and enjoy a movie in a cinema, the chances are it is being projected onto the screen at either about 2K or 4K resolution.  So 8K would be at least four times, and possibly up to 16 times, more quality (loosely speaking) than you get in a theatre.
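
(Loosely speaking indeed, but the raw pixel counts do support those multiples.  A quick Python comparison – the cinema numbers below are the common DCI projection resolutions, 2048 x 1080 and 4096 x 2160:)

    cinema_2k = 1080 * 2048   # about 2.2 million pixels
    cinema_4k = 2160 * 4096   # about 8.8 million pixels
    home_8k   = 4320 * 7680   # about 33.2 million pixels

    print(home_8k / cinema_4k)   # 3.75 - roughly four times a 4K projection
    print(home_8k / cinema_2k)   # 15.0 - roughly sixteen times a 2K projection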

As this table shows, unless you’re unusually – almost uncomfortably – close to the screen, your eyes won’t see any difference in picture detail/quality/sharpness.  The table comes from a useful website – on this page you can plug in the size of your screen to find the distances at which you benefit from different resolutions.  (But, remember, the move from 1080p to 4K isn’t as much about resolution as it is about color and contrast.)
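
(If you’d like to reproduce that sort of table yourself, the underlying calculation is straightforward:  assume normal visual acuity resolves about one arc-minute, and work out the distance beyond which a single pixel shrinks below that threshold.  A rough Python sketch – the one arc-minute figure is the usual rule of thumb rather than a hard limit, and the 65″ screen size is just an example:)

    import math

    def max_useful_distance_inches(diagonal_inches, vertical_pixels, aspect=(16, 9)):
        """Distance beyond which one pixel subtends less than about one arc-minute,
        i.e. beyond which extra resolution is wasted on a normal-sighted viewer."""
        w, h = aspect
        screen_height = diagonal_inches * h / math.hypot(w, h)
        pixel_height = screen_height / vertical_pixels
        return pixel_height / math.tan(math.radians(1 / 60))

    for pixels in (1080, 2160, 4320):          # 1080p, 4K, 8K
        feet = max_useful_distance_inches(65, pixels) / 12
        print(pixels, round(feet, 1))          # about 8.5, 4.2, and 2.1 feet on a 65" screen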

The impracticality of 8K extends to the bandwidth it would require to stream.  4K movies require 15 – 25 Mbps to stream, and could benefit from even more bandwidth (for less compression – a 4K Blu-ray disc plays back with a 50 – 100 Mbps data stream, and that is already compressed).  8K would require 60 – 100 Mbps to stream with similar compression to 4K movies, and could grow to as high as 500 Mbps.  Doubling the frame rate, as is generally proposed, would almost double these numbers again.
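
(The 8K estimates follow from simple scaling of the 4K figures – a rough Python sketch, using the bitrates quoted above.  Real codecs don’t scale perfectly linearly with pixel count, so treat this as ballpark only:)

    pixels_4k = 2160 * 3840
    pixels_8k = 4320 * 7680              # four times as many pixels

    for mbps_4k in (15, 25):             # typical 4K streaming bitrates
        mbps_8k = mbps_4k * pixels_8k / pixels_4k
        print(mbps_8k)                   # 60 and 100 Mbps
        print(mbps_8k * 2)               # 120 and 200 Mbps at double the frame rate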

But there’s an even bigger problem.  99.9% of all movies and television shows were not filmed in 8K format.  If a movie was shot on film, and if the image size of each frame is larger than, say, 1.5″ x 2.6″, then the material could be remastered to 8K; but if it is on a smaller format, there just isn’t enough detail on the film to create an 8K scanned image.  Film seems to hold about 3000 pixels per inch of data (based on its grain size), and almost nothing is filmed on film as large as would be needed to generate an 8K digital image.
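
(That frame-size figure follows directly from the grain estimate – a quick Python check, using the approximate 3000 pixels-per-inch assumption above:)

    pixels_per_inch = 3000                    # approximate detail film can resolve
    width_needed  = 7680 / pixels_per_inch    # 2.56 inches
    height_needed = 4320 / pixels_per_inch    # 1.44 inches
    print(width_needed, height_needed)
    # A standard 35mm frame is only about 1" x 0.75" - far too small - and even
    # this comparison is approximate, since grain size varies by film stock.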

If the original content was filmed digitally, and that is how more and more movies are filmed these days, whatever resolution it was filmed at is the maximum resolution it can ever be.  You can’t create valid new detail where it doesn’t already exist.

While it is possible to film content at 8K now, and indeed some content is being filmed in 8K, much of the time, the reason for the 8K filming is not to create a final 8K video, but to be able to “re-frame” the shots in the editing booth – to pan and crop around the entire 8K image to select the best 4K part.

One other point of note.  Let’s say there is a movie filmed on huge size film, perhaps one of the modern blockbusters where money was no object.  The chances are the movie also has plenty of special effects that have been generated by computer, and those special effects will probably be at something like 4K resolution, not 8K, so they will look strange if up-sampled to 8K.

So we don’t expect to see a lot of progress towards 8K any time soon.  But the industry appears committed to it, because it will be an excuse to “encourage” you to upgrade one more time.

However, in our own case, the only reason we upgraded to 4K was for the improved color and contrast.  Those factors (which are essentially limited by the LED technology in the screens at present) remain just as much a limitation for 8K as for 4K.

Sure, with the relentless march of technology, one should never say never, but we will confidently assert “not in this decade”.

The Best 4K Gear

If this has encouraged you to consider upgrading some of your 1080p (or even earlier) gear, what should you look for in 4K gear now?

We’ll answer that question in future parts of this series.  And if you missed the first part of this introduction, you can see that here.
