
CES 2011: The biggest winners and losers

The Consumer Electronics Show is the biggest stage of the technology industry, and every CES has its winners and losers. Here is the scorecard for CES 2011.

The Consumer Electronics Show is the Super Bowl of the technology industry. As much as industry analysts and the tech press whine about CES being too big and being a relic of a bygone era, there's no better place for tech companies to make a big splash that will be remembered throughout the year, and in some cases for years to come.

It's also the place where tech companies can jockey for a better position in the market by generating more attention for their product line than a competitor's. Conversely, companies that don't make a good showing at CES can risk creating an impression that they are falling behind and risk having their products get lost in the crowd.

As such, every CES has its winners and losers. Here is this year's scoreboard.

WINNERS

Motorola Mobility

Motorola Mobility is the new Motorola spinoff that went live as a separate entity on the New York Stock Exchange on the Monday of CES week. The new company focuses entirely on consumer devices, and its first CES couldn't have been scripted any better. In partnership with Google and Verizon, it officially announced the new flagship Android tablet, the Motorola Xoom. It announced one of the first high-powered smartphones running on Verizon's LTE 4G network, the Droid Bionic. And, to top it all off, it unveiled the breakthrough product of CES 2011, the Motorola Atrix, a smartphone that doubles as a legitimate PC replacement with both desktop and laptop docks.

Verizon Wireless

In the past, Verizon Wireless has saved most of its big product announcements for telecom-specific trade shows or its own private events. This year, the company came to CES with both guns blazing. On the heels of last month's successful US launch of its 4G LTE network -- which is, arguably, now the world's most advanced 4G Internet experience -- Verizon had LTE live in Las Vegas and ready to show off to the world's largest annual crowd of technology enthusiasts. The network performed like a champ. And, on the official opening day of CES, Verizon held a press conference announcing 10 LTE devices -- smartphones, tablets, and hotspots -- that will launch during the first half of the year. It was an impressive roster of devices from leading vendors such as Samsung, Motorola, HTC, and LG. The Motorola Xoom tablet and the HTC Thunderbolt were especially impressive. I spent 25 minutes using the Thunderbolt and was blown away by the speeds generated by its high-end hardware paired with Verizon LTE.

NVIDIA

The company that was seemingly everywhere at CES 2011 was NVIDIA. Specifically, the dual-core NVIDIA Tegra 2 was the processor du jour, showing up in many of the new smartphones, tablets, and even car-tech announcements. At CES, NVIDIA announced that it is partnering with ARM to launch its own high-performance CPU cores the size of a dime. This is the long-rumored "Project Denver" that NVIDIA has been working on, and it will likely position the company as a powerhouse in next-generation computing, since ARM processors power nearly all of the smartphones and tablets coming to market and are now scaling up to run PCs and servers as well. NVIDIA looks like it could become the Intel of the next great wave of computing.

HONORABLE MENTION: Samsung

LOSERS

Microsoft

This is beginning to sound like a broken record, but I'm still waiting to hear any hint of Microsoft's vision of where it sees the computing world headed in the decade ahead and how Microsoft plans to take us there. We didn't get it at CES 2011, even though Microsoft once again had the biggest stage -- the opening keynote the night before the official opening of CES. Microsoft squandered its opportunity to set the agenda of the tech industry. Instead, it simply trotted out its standard stump speech on "Windows everywhere" and replayed many of the same demos from its 2010 product launches. Then, once the show opened, many of Microsoft's traditional PC partners spent CES spotlighting their tablets instead of new netbooks or laptops running Windows 7. Microsoft's newly launched Windows Phone 7 devices got completely overshadowed as partners HTC, Samsung, and LG put their new Android smartphones front and center. And, at a show dominated by the launch of new tablets, we heard nothing from Microsoft about its tablet plans. Even Microsoft's long-time partner Intel criticized Microsoft's missing tablet strategy.

Sony

Sony's CES story is not unlike Microsoft's. Where's the leadership? Where's the vision? Sony has traditionally been the biggest name in the consumer electronics industry. It should dominate CES. But when's the last time you heard about a breakthrough product from Sony? When's the last time you heard about a market-leading product from Sony? Its competitors have been running circles around it, and it was no different at CES 2011. Sony put on some glitzy presentations but lacked product leadership. Much of that was because Sony focused most of its energy on the wrong thing -- 3D TV. While Sony had perhaps the most impressive 3D television at CES, that's a dubious distinction akin to having the nicest house in a neighborhood scheduled for demolition. Contrast that with what Samsung did at CES. It delivered thin, high-quality, reasonably priced (non-3D) TVs -- the ones people actually want to buy. But TVs weren't Samsung's only story. It unveiled exciting new smartphones, tablets, and PCs, including one of the most innovative new products at CES, the Sliding PC 7, a hybrid laptop/tablet.

Intel

One of the major trends at CES 2011 was the world's accelerating shift to mobile computing. With the unveiling of dual-core smartphones, over 80 new tablets, smartphones that also serve as PCs (the Motorola Atrix), and an emerging new category of hybrid tablet/laptops, it's clear that technology companies are rising to meet the demands of users, who are spending more and more of their computing time on mobile devices instead of traditional PCs. If you look closely at this army of new devices -- which were the headliners of CES -- almost none of them are running Intel processors. Practically the whole fleet is running ARM processors in general and the dual-core NVIDIA Tegra 2 specifically. Intel has known that this trend was coming. I remember Intel presenting a slide at the 2008 Intel Developer Forum showing how sales of mobile devices were going to rapidly overtake traditional PCs in the years ahead. Intel attempted to meet this trend with its Menlow and Moorestown processors for "Mobile Internet Devices" (MIDs), a category that it tried to create with its hardware and software partners. However, Intel's chips couldn't match the low-power efficiency of ARM chips, and the rise of smartphones crowded MIDs out of the market. Instead of going back to the drawing board and making a more power-efficient class of mobile chips, Intel continues to fall behind. While NVIDIA chips were powering nearly all of the tablets and high-end smartphones at CES, Intel played up its "Sandy Bridge" chips for multimedia PCs. This focus on high-end chips feels like a tacit admission that Intel has no answer in mobile.

About

Jason Hiner is the Global Editor in Chief of TechRepublic and Global Long Form Editor of ZDNet. He is an award-winning journalist who writes about the people, products, and ideas that are revolutionizing the ways we live and work in the 21st century.

69 comments
azasadny

You left off Microsoft's decision to pull DE (drive extender) from Windows Home Server, which effectively kills it... What a waste!

PhilippeV

Winners and losers are too concentrated on US mobile network providers. People actually don't want to be tied to their providers; in fact they want more universal products that allow them to switch and connect from any place of their choice. I'd really like to see announcements where all the mobile products owned by someone create a personal network around the user, allowing all kinds of computing solutions to interact. This should include media centers, the GPS device in your car, your desktop or laptop PC at home, your tablet (which is unusable in a car), and the smartphone in your pocket or handbag. The real announcement people are waiting for is more integration. Then let them use the device format they feel is best for every kind of situation. Unfortunately, all those devices interact very poorly with each other, because they severely lack connectivity AND software solutions for integration. If only they could join any wireless network they can find in their neighborhood and that we could associate. And if only they could synchronize with each other (or with remote Internet storage) easily. All those solutions are still too much centered on specific providers, or on the device itself. People want user-centered solutions. So what is really important:
* excellent connectivity: all wireless networks (3G, WiMAX, Bluetooth, WiFi, wireless USB) plus a standard SD card reader and at least one USB port (working in dual mode: slave or master)
* a choice of software solutions: not just a browser or applications specific to a mobile OS, but also support for a virtual machine (Java, .NET) where applications can be transparently deployed, a choice of synchronization mechanisms, and a remote desktop system to interact with more powerful (remote or nearby) computing systems
* media center capabilities (in both server mode and client mode)
* autodiscovery mechanisms
* offline computing capabilities
* management of all storage from any device, without needing to place devices in a specific mode that makes them unusable.
I'd also like to be able to use a mouse and a keyboard with my smartphone: Bluetooth or Wireless USB should do the trick. And the possibility to use a larger screen with my mobile appliance: we need a "Wireless VGA" or "Wireless HDMI" standard, and hardware support for it (i.e. hardware-supported audio/video codecs, compatible with MPEG and independent of proprietary technologies like Flash, with integration that makes them as easy to use as plugging a monitor into a desktop PC). I'd also like to see a standard for AV/HiFi remote controls. All those functions should be described in terms of service description, device discovery, and installable applications that can be used from any device or PC through the local wireless network. Let's drop those remote controls; we should be able to use our smartphone or tablet just as easily, with their touch screens and a simpler interface than the complex buttons. Users should also be able to redesign their interface, so all controls on the touch screens should be tweakable to fit what they want. When will remote controls start using standard protocols: Bluetooth, wireless USB, or WiFi? I'm also fed up with those multiple remote controls (the so-called "universal remote controls" actually do not work, or lack most of the features supported by the devices we want to control with them). EVERY DEVICE in a personal network should be able to interact with any other device in that same personal network, according to its complete exposed capabilities. It does not matter if this is a PC monitor, a high-end flat TV, or the display of a smartphone. And the mobile phone operators MUST also stop their attempts to take control of our personal computing or media environment.
We want neutrality and don't want them to restrict us to using only the smartphone, or to pay additional costs just because we plugged in a USB cable to interconnect a PC. I am also fed up with their attempts to restrict the protocols, or force us to use their costly 3G network for basic interaction. People are fed up with all those restrictions (impossible to understand or even manage in our billing) that are plaguing their mobile networks, and with the commercial arrangements that force us to use only their service in so many situations.

RLiang

Bang on, Jason! In an age when everybody's supposed to have an iPad killer, where's the leadership and technology from not just Microsoft but Dell and HP? Check out their websites and there's no mention of tablets. I think Windows Phone 7 is missing the mark. The ads look to be targeting business users, yet given how far behind RIM and Apple they are, Microsoft should be giving away seed units to all their Select and Enterprise customers. If they are targeting consumers with Phone 7, the ads don't convey that at all.

techotter

I'm a little surprised that there wasn't a greater outcry about Intel's Sandy Bridge chips having the capacity to be shut off from a remote location. I'm wondering at what point in the future a journalist will have his/her laptop shut down during an important project, discover it's because the processor was shut down remotely, and then harp about this "new" invasion of their rights and privacy. I guess glitz beats substance at CES... at least when it comes to Intel.

lwalden@ebmud.com

Did you see the Notion Ink ADAM tablet? It has everything the future Xoom will have, today.

BillGates_z

"it?s clear that technology companies are rising to meet the demands of users, who are spending more and more of their computing time using mobile devices instead of traditional PCs" What's clear is that this is where companies see huge profits. Can't charge twice for a PC but a $125 (and going up) monthly mobile service fee goes on forever>

yanyo

I think BlackBerry could have a mention in Losers too.

Adam in DC

Another issue with Microsoft is its confusing direction in the home server market. MSFT and HP have a small but vocal following with the HP MediaSmart Server line, as well as some other manufacturers and DIYers. More and more consumers need a simple, centralized place to store their documents and media -- so that place can easily be backed up to the cloud -- rather than have things distributed across multiple PCs in a home. WHS is one of the best products for that and MSFT is practically abandoning it.

willyram

Good overall. I would have mentioned Google with Android as a winner also.

mathan_gm

I think Sony at CES is not about 3D alone. They have been speaking a lot about content delivery: Sony Internet TV, for example, and Sony's new HomeShare, which will let you stream music all over your home. The Music Unlimited 'Qriocity' service learns what music you like and lets you access it from most of Sony's connected devices. The Sony Ericsson Xperia arc will use the Android 2.3 Gingerbread platform and have the Mobile BRAVIA engine. And 3D-ing everything is an added advantage.

tektok

Couldn't agree more. But you forgot Panasonic in the list of losers. Perhaps just above Microsoft. Jamie tektok.ca

panicinwi

I don't know how you can list Verizon as a winner and not include AT&T. Verizon announced 10 4G devices while AT&T announced 20. AT&T's network is already better than Verizon's, and with the way they are updating their network they will have much better fallback speeds than any of the other carriers.

cant_drive_55

Good analysis, Jason. I think you are right on target. Two things come to mind: 1- 4G and LTE are pretty much ubiquitous in urban, populated areas, but there are huge swaths in the US where there is little or no cell coverage. This needs to be addressed and is something that the federal government could very much assist with by clearing the way for wireless companies to install enough towers to provide near-total coverage. 2- Where will this take the traditional desktop computer sector in the next several years? I know that the tendency is to really push desktop capacity by having large numbers of programs and windows open at one time with multiple monitors, but there is also the desire to carry it all with you. Will we start seeing a standard integrated into the O/S, of a WWAN (world-wide area network) where there is a Dropbox-style extension of the LAN to many devices no matter where they are? (Dropbox already has a feature where two devices in the same LAN do not have to sync via the cloud.) That is where Microsoft could provide leadership, but they will want a HUGE slice of the action, no doubt...

vucliriel

... And it's high time the tech press talked about it!!!

MrRich

Qriocity? Kind of too late to reinvent Pandora. That horse is already out of the barn... Sony are a great company, but becoming a content producer left them with a huge internal conflict that has prevented them from marketing music sharing devices. They will still be trying to sell DRM after it has died a horrible death.

jmarkovic32

Sony is full of nothing but useless gimmicks like Blu-ray and 3D. All they seem to be delivering are diminishing returns. How is seeing a pulsating zit on someone's face in 3D going to improve the quality of my entertainment experience?

Hazydave

Verizon is already hot with real (almost) 4G technology, LTE at 700MHz. They have 3G on every cell. AT&T just finished their last 3G update, HSPA+. They might call that 4G (probably because T-Mobile did), but it's not, and never will be. And they only have some kind of 3G (HSPA or HSPA+) on less than 25% of their cells. And they won't even be rolling out real 4G (also LTE, also at 700MHz) until sometime this summer, or later. AT&T is way behind Verizon... even behind Sprint on 3G support and actual 4G technology. People are going to be confused by AT&T, too. How many of their "4G" offerings are really just 3G/HSPA+, and which, if any, will actually support LTE later this year? Looks like they're setting themselves up for even more customer frustration. Add to that the fact that Verizon still offers unlimited data plans for cellphones, and it's no wonder folks have been leaving AT&T in droves. And that's even before the Verizon iPhone....

wild__bill

AT&T's network is not even in the same league with Verizon's, ask any AT&T owner about their dropped calls and legendary waits for data in peak times & cities - get off the crack pipe! PS - they don't even have an answer for Verizon's LTE which is deployed now, until the end of this year, and it will be, tada, LTE, about a year late!

MrRich

It's clear that the boundaries of where you work are breaking down. But nothing beats a big display (or two) for efficiency. It's no longer about the CPU in the device; it's about the screen real estate...

kales

We are not built to compute; au contraire, we want to communicate: ideas, plans, visions, emotions with other people, and in the process create something new. We are the masters of re-using; if somebody has already done the math, we'll copy it. So: text, numbers, pictures, videos, music is what it is about, to be entertained, or educated for a new job, preferably in a social environment.

bobbias

Thanks for making the point about coverage. As a Canadian, I know very well how bad it can be when there are large areas with little to no coverage at all. Sadly, for your second point, I just don't see something like that happening in the near future. Creating a complete shift like that requires a lot of coordination between different companies, and unless the companies themselves come to the same conclusions that led you to want such a system, I can't see it happening for quite some time. What I can see is more integration of cloud computing into portable devices. I could even see something where the device itself is only really a thin client for a cloud computing system. But heavy integration with the home system? I dunno; I think the solutions will be more cloud-heavy in the near future before they finally figure out a more direct way of doing things.

thoiness

HD has enhanced my viewing experience quite a bit. I don't know about "pulsating zits," but an air of realism to my entertainment is definitely not a bad thing. Now I'm not saying that Blu-Ray is the future of entertainment, as I don't have a Blu-Ray title in my inventory. But I strongly believe HD delivered via internet media is going to be the future of entertainment. Especially in a market where the cable companies have priced themselves out of the middle class and show no intentions of curbing their rate hikes.

NickNielsen

Leave metro Atlanta. Don't take the Interstate or any other 4-lane road. "Can you hear me now?" No. There's a whole big world out there, particularly in rural Georgia and South Carolina, where AT&T's coverage is much better than Verizon's, even though the network may be slower. Heck, there are several places (including a rest stop or two) on I-20 between Atlanta and Columbia where I can't get Verizon, but AT&T is loud and clear.

MrRich

Someone want to enlighten me about why Verizon coming out with 4G is a big deal? We've had it on Sprint since last spring. It is good that Verizon are promoting a phone at the same time, because the devices are lacking as of yet. IMHO what AT&T and Verizon don't offer is decent business support. It's more expensive and doesn't offer the same support I get today. (Maybe I am the only happy Sprint customer, I dunno...)

vucliriel

At the risk of sounding like a broken record... Can anyone explain why, with cameras boasting double digit megapixels for years now, we are still made to endure tunnel-vision-like, sub 4MP computer monitors? Why aren't we seeing 10MP and more computer monitors? We've been stuck on low res for over a decade now! Seriously, folks, we've been stagnating for the past decade, as any improvements have been made strictly to handle the incessantly increasing aggravation of User Control. Simply looking at how much software has become dysfunctional should be enough to convince anyone...

RipVan

HD makes some movies look like TV. And some TV looks like bad plays. I like it, though. But my brothers and I all look at each other at times when those moments come up. We will have to come up with a name for that. I'm not creative enough to come up with one, but hopefully someone will. I wonder if HD will somehow change acting or the recording of content. It really could use "something." I just don't know what...

SkyNET32

I'm not interested in watching HD content for, say, a network comedy show; that's useless. HD shines in CGI in movies, so whether they are delivered via a cable company or the Interwebs, as long as it's done right, that's fine. I don't need to watch today's TV shows (needless to say, the content on today's cable shows is, for the most part, horrible, but we'll save that argument for another day XD) in 1080p, so for those folks who whine about Hulu not being full 1080p: big deal, unless it's a sci-fi or nature show. Anyways.......

Hazydave

I have driven up and down the Central to North Atlantic coast, listening to Pandora on my Verizon Droid, and it never glitched. My sister, on the same roads, drops on AT&T about half the time she calls. OK, sure, she has an iPhone, which contributes, but the network just isn't that solid.

RipVan

The only reason I went to TMobile was for the MyTouch. I wanted a smart phone and did not want to become an iSheep. I am in the Cincinnati area and coverage was not as good as Verizon, but it didn't bother me. What did bother me was that I later discovered that my daughter, who was at school north of Columbus, really only got service from Verizon. It isn't just TMobile; go east from there on any network and you will find dead spots. But ONLY Verizon serves her campus well. I didn't really believe her, so I went up there to see. I found that I got a minor signal outdoors (dropped completely in various places around town) and absolutely no signal indoors. So come August, and the end of my contract, NO MORE TMOBILE. Now everyone has the droids I'm looking for.

NickNielsen

There are several locations on I-20 between Atlanta and Columbia, SC, where, for me, Verizon drops; I can give you mile markers. There's a location on US 76/301 between Florence and Marion where Verizon drops. There are whole swathes of countryside outside diverse places like Statesboro, GA, Swainsboro, GA, Sandersville, GA, Aiken, SC, Bamburg, SC, and Barnwell, SC where Verizon doesn't even go. Granted, AT&T doesn't have full coverage in many of those areas, but my experience is they have much better coverage there than Verizon. Of course, I'm using AT&T with my seven-year-old Samsung penny phone (that still says Cingular), and calling Verizon on a three-year-old Motorola V-750. Yeah, that could be it... Added: All I'm trying to do is make telephone calls. I don't even fire up the Verizon air card unless I see a city limits sign, because I know it won't connect out there in the country.

SkyNET32

I can't speak for Sprint's or T-Mobile's networks, but I have Verizon, and my wife has an iPhone on AT&T, and she swears she's happy with their coverage for voice calls, but complains sometimes that downloading over 3G is lame. I can't really comment on Verizon since my phone is "old" (HTC Ozone running Windows Mobile), so I'm more angry that the phone locks up and is non-responsive to my input than at Verizon's network. I do notice at my house I get very few bars, but my wife is only slightly better. Soon as I get to work an hour away my bars are full tilt. Location, location, location..... :D

kevster25

I have to agree about the quality of Verizon's coverage versus AT&T's. Here in California they are far better in call quality (number of drops) and coverage than AT&T. I travel to some pretty rural areas and I'm often surprised at how good Verizon's coverage is on my work Droid phone. I can't wait to get my iPhone on VZ!

320vu50

I recently worked on the wireless location deployment for a county the size of Connecticut and Rhode Island combined. The hands-down best coverage was Verizon. I have also made cross-country trips in the last few years, and the coverage I had on my Sprint phone was excellent, except in the very rural areas and places with mountain terrain. The best coverage was in the Eastern states and the West Coast region, but in the rural middle the coverage (even on roam) had large dead spots. One thing to note is that T-Mobile has not been sued over its ads comparing the two networks. And they don't seem to be included in the discussion here.

john3347

I am a Verizon customer (switched about 4 or 5 years ago) because AT&T frequently dropped my calls in downtown Atlanta -- forget suburban Atlanta. I still frequently travel through Atlanta in different directions, and I don't remember ever having lost a call with Verizon or ever having failed to secure a connection on my Verizon USB aircard anywhere in the state of Georgia, even while moving. (Perhaps personal hardware has as much to do with reception as provider infrastructure.)

Hazydave

LTE is a significant improvement over Sprint's WiMAX, and not just in speed. Yeah, it's faster per MHz of bandwidth, but the end user only sees the network-imposed speed caps anyway, not the protocol caps. LTE is improved because they moved away from OFDM modulation on the uplink. What that means: rather than using more power on 4G links as WiMAX does, 4G will actually take lower power. The other improvement is the RF band. Verizon and AT&T have always worked a little better than Sprint and T-Mobile in buildings or rural areas, not due to any protocol differences, but the fact that they had 850MHz channels in addition to the 1900MHz also used by Sprint and TMo. Sprint is on 2500MHz for WiMAX (Clear and Comcast, too; they are all partners in the same WiMAX network)... lots of bandwidth, but more free-air path loss, more loss through structures, way more loss through foliage. Anyone who's set up a long-range WiFi network knows what to expect. Verizon is on 700MHz with LTE... much better range, much better penetration. AT&T will also be using LTE on 700MHz, sometime late this year. But Verizon is seen as an early winner, largely because they haven't had the financial problems of Sprint or the upgrade expenses of AT&T prior to the 4G rollout. Verizon had full 3G coverage, every cell in the network, years ago... AT&T isn't there yet. So Verizon can be expected to make good on their claim to have full 4G coverage in two years... every cell. No one else even makes that claim. And, at least until Sprint 4G reaches my house (rural South Jersey), I wouldn't personally care... full 4G coverage only matters if "full" includes where you actually need it.
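The frequency effect described above (more loss at 2500MHz than at 700MHz over the same path) can be quantified with the standard free-space path loss formula. Here is a minimal sketch in Python; the function name and the 1 km example distance are illustrative assumptions, not from the comment:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 1 km path, different carrier frequencies:
loss_700 = fspl_db(1.0, 700.0)    # 700 MHz band (Verizon/AT&T LTE)
loss_2500 = fspl_db(1.0, 2500.0)  # 2500 MHz band (Sprint/Clear WiMAX)
print(round(loss_2500 - loss_700, 1))  # ~11 dB more loss at 2500 MHz
```

Since the difference depends only on the frequency ratio, the 2500MHz signal sheds roughly 11 dB more than 700MHz at any distance, before any extra loss from buildings or foliage.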

Spitfire_Sysop

I too am a happy Sprint customer. They are often leaders in back bone infrastructure. Verizon eventually gets more towers in more places but because they use the same technology and roaming is free the usable networks are identical. Verizon customers use Sprint towers and vice versa. So the reasons to pick one or the other aren't "the network" like Verizon loves to claim. Especially when Sprint did in fact have the 4G first.

HAL 9000

But it is Reflected from the Screen that it is Projected onto. We could now have a long detailed debate on the benefits of one screen type over another and which was best. For that matter we could go into long detail as to which format was better: 35 mm, Medium Format 2.25 or Large Format Graphic Arts. I've used all forms in both Positive & Negative Film, and as to your comments about the best film, it all depends on what you are taking photos of. I've used lots of Ektachrome, mostly in Medium Format, but then I was taking photos of Plants. For motor racing I've found that Fuji made the best high speed film that produced good colour and little graininess, which is always a problem when you start shooting in the very high speed ranges. For that matter, even the best camera format depends on what it is you are doing. Blads get very cumbersome, particularly the Motor Drive units, for lots of Hand Held work, and Graphic Arts is even worse, though you'll never see anyone with that type of format at Motor Racing, thankfully. ;) Even the humble 35 mm with a 1200 mm Telephoto can get hard to deal with over long periods of time, but for work like that it's easier to use hand held. Yes, NG always asked for Positives, but not because they produced a better print -- because it suited their Printing process better. The Printed Photos in NG, while looking good, were a very poor reproduction of the original transparencies, but for a Bulk Print Magazine they were very good. I still defy anyone to produce a better photo with Digital, as even the best high end H4 Type camera is still only 60 Mpixel, whereas the Film is far more detailed and the grains so much smaller. We have yet to reproduce the resolutions achievable in a Film format with Digital. But as so many places accept Digital Images, it's not because they are better but because they are easier to work with and far cheaper to use.
Hardly a sign that Digital Images are High Quality, which they are not, just lots cheaper to work with. The 500ELM Blad that I have here still produces much better images than the H4D-60 that the Blad Agents here insist on lending to me in the hope that I'll buy one. Col

NickNielsen

without specifying which year you were 'yestering'; I chose to assume a year that made my point. You don't have to tell me about transparencies; I've developed more than a few rolls of Ektachrome 100.

vucliriel

1) Projection imaging does predate printing. It is also the only way to view all the information contained in the photographic film, because light is TRANSMITTED through the medium (NOT REFLECTED, HAL9000). Reflection is the principle behind paper printing. To give you an idea that corresponds to something you may understand: paper is to projection viewing what reflective LCD is to backlit LCD. Surely I don't need to elaborate? 2) Printing IS a relic from yesteryear, when images could only be produced on some sort of REFLECTIVE support, such as paper. 3) 4x6 printing is going to continue as long as there is no cheap and effective alternative. Because to see the DETAIL available on a 4x6 print at 600 DPI you would need a 4x6 screen (at 300 PPI, which arguably corresponds to 600 DPI, this corresponds to 1200x1800 resolution). I haven't seen any yet, and photoframes still have pitifully dismal image resolution at this time. Now transpose that to the 8x12 standard enlargement format, which would be suitable for photoframe use and corresponds to laptop screen size, and you need 2400x3600. Where are those screens? WHAT ARE THE MANUFACTURERS WAITING FOR?
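The pixel arithmetic behind those screen sizes is just print dimensions times pixels-per-inch; here is a short sketch that checks it (the helper name is my own, not from the comment):

```python
def print_pixels(width_in: float, height_in: float, ppi: int) -> tuple:
    """Pixel dimensions needed to match a print of the given size at a given PPI."""
    return (round(width_in * ppi), round(height_in * ppi))

print(print_pixels(4, 6, 300))   # a 4x6 print at 300 PPI needs 1200x1800 pixels
print(print_pixels(8, 12, 300))  # an 8x12 at 300 PPI needs 2400x3600 pixels
```

The same calculation shows why photoframes of the era fall short: an 800x600 panel covers well under a quarter of the pixels a 4x6 print resolves at 300 PPI.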

vucliriel

You are referencing old history, not recent history at the height of magazine publication. Are you familiar with pre-digital photography at all? Does slide film ring a bell? The National Geographic magazine, one of the most respected of all publications, with consistently the world's best photography, specified TRANSPARENCY film exclusively. And to view this medium, you PROJECTED IT ON SCREEN.

I have been in photography for over 40 years, and MOST of my work (nature and architectural) was done on slide film (35 mm and 2 1/4), to be viewed on a projection screen EXCLUSIVELY. Prints produced for display purposes are invariably inferior in dynamic range and color fidelity to transparencies, even using the best positive process of the time, the Ilford Cibachrome process. The best color images were produced using Kodachrome 25 transparency film. This process, dating from the 1930s, produced the most finely detailed color pictures of any color film. Also, using Ektachrome (another transparency film) and the Zone System, one could produce images with a dynamic range close to what the eye can see. Try that with a digital camera using a single exposure! In black and white, only Kodak Technical Pan, exposed and developed using the Zone System, in conjunction with Agfa Brovira paper, could come close to reproducing range and detail approaching what is visible to the human eye.

But the fact of the matter remains that the reflective medium, paper, simply cannot reproduce the dynamic range and color possible with projection imaging. And that is not taking into account the image degradation introduced by the extra processing steps! Bottom line, there is simply no substitute for the projected image. Ergo the need for imaging devices that can reproduce images as they were captured in camera, without the limitations and inevitable degradation of the printing process.

HAL 9000

Shops are advertising 15 cent prints when you bring in your images on thumb drives. :D Even the ones who burn to CD will not add a second session, and they fail to tell people that the CD can only last 5 years maximum. I can see lots of people losing their images and being very peeved when they find out. The day that the photo print is long gone has yet to arrive. And by the way these mini printers are working overtime, it's going to be a long time before that happens. ;) Col

NickNielsen

Projection screens predate the print by centuries.

HAL 9000

Projection screens don't have any resolution by themselves. ;) They just reflect what is projected onto them. Col

NickNielsen

[i]History proves that most published images started as transparencies, which were viewed ON SCREEN, at full resolution, prior to publication. [/i] On screen? If I remember my history correctly, photography became economical in the 1840s, but it was [u]monochrome[/u] only and the images were printed on metal or glass. Color photography wasn't perfected until the 1890s, and the screens to which you refer wouldn't exist for at least another century. http://en.wikipedia.org/wiki/Photography#History http://en.wikipedia.org/wiki/Color_photography#History

vucliriel

The formula describing the ideal monitor size at a 20-inch viewing distance (in my previous post) should have read 2*TAN(60/2)*20, NOT 2*COS(60/2)*20, which is about 23 inches, NOT 35 inches. In other words, normal vision would benefit from a 23-inch screen at a laptop working distance of 20 inches. Another error is in the calculation of the total resolution. One minute of arc over a 60-degree field of vision is 3600 pixels diagonally. If we assume that field of vision is circular, this equates to (3600/2)^2*PI, or about 10MP, instead of the (3600)^2, or 13MP, previously calculated. In other words, IBM had it right with its 20.1-inch, 9MP T221...
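The corrected figures can be verified with a short calculation, using the same assumptions as the comment (a 60-degree circular field of view, 1-arcminute acuity, a 20-inch viewing distance):

```python
import math

distance_in = 20        # viewing distance in inches
fov_deg = 60            # assumed field of view, degrees
acuity_per_deg = 60     # 1-arcminute acuity = 60 resolvable points per degree

# Diagonal subtended by the field of view at the viewing distance
# (tangent, not cosine, as the correction notes):
diagonal_in = 2 * math.tan(math.radians(fov_deg / 2)) * distance_in
print(round(diagonal_in, 1))          # ~23.1 inches

# Pixels across the field: 60 points/degree * 60 degrees = 3600.
diag_px = fov_deg * acuity_per_deg

# Treating the field as a circle of that diameter:
total_px = math.pi * (diag_px / 2) ** 2
print(round(total_px / 1e6, 1))       # ~10.2 MP
```

Both numbers match the correction: about 23 inches and about 10 megapixels, close to the IBM T221's 20.1-inch, 9MP panel.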

vucliriel

You make a good economic argument for this state of affairs. Economic reasons are based on demand, and consumer demand is obviously driven by movies. Therefore, we are stuck with movie-format screens and resolutions. Hmmm... However, your argument that we're 'supposed' to print stills is totally preposterous. History proves that most published images started as transparencies, which were viewed ON SCREEN, at full resolution, prior to publication. The fact is that images viewed on screen are vastly superior in luminosity, dynamic range and fidelity, and much, much closer to what the eye sees than the feeble reproduction on dead trees. The print is a relic from yesteryear and a medium best suited for artistic endeavours, not for realistic representation of reality.

vucliriel

Higher resolution monitors allow you to view high resolution images full screen, with NO SCROLLING. Lower resolution monitors sacrifice the camera's resolution. For a photographer, this is unacceptable. "Why do we need higher resolution monitors" is like asking "why do we need IMAX movies". If you have ever seen high resolution images, you would instantly know what I mean. There is simply no comparison, because high definition corresponds to what the human eye can see. Anything less is a deliberate loss.

Ansel Adams took his great photographs with a camera sporting an 8"x10" negative, capable of roughly 40000x50000 pixels, or 2 GIGApixels, and it truly SHOWS in the resulting image quality. And that was almost a century ago! Saying we don't need this resolution is like saying that computers don't need more than 640K of memory, or that computers don't really need monitors (like the Altair). The argument that we only need more resolution in cameras because of processing is like saying we don't need stereo or high fidelity in sound because we can still get the feel and meaning of the sound on cheap speakers. That is preposterous, of course. It's like saying there is no difference between a recording and a live performance. Same thing in imaging. High resolution is exactly like high fidelity: it is ALWAYS better.

On a practical level, larger monitors make it easier to multi-task. You can work on a document on the right side while you look at a picture on the left side. Of course, you could say the same could be done with multiple monitors, but you would never be able to see the pictures in all their glory, nor the document in its true life-like representation. In this day and age, when the technology has existed for 10 years already (the 9MP IBM T221 is that old), there is no excuse for not being able to produce high resolution monitors, or for continuing to produce documents on dead trees.

Remember, the normal human eye can resolve 1 arc minute of detail. At 20 inches (500 mm), that detail is about 0.145 mm, or 175 PPI, which translates to about 1920 vertical pixels for a standard 11-inch-tall piece of paper viewed at that distance. At a ratio of 16:10, at that distance the normal human eye could actually benefit from a 21-inch diagonal monitor running at 2560x1920, and THAT is what laptop monitors should be running at!

The other thing we forget, hence my term "tunnel vision" in my last post, is that the human eye has a normal field width of 60 degrees, which means a normal imaging capacity of 60 minutes of arc times 60 degrees, or 3600 pixels diagonally, roughly 13 MP. This means a normal human eye could greatly benefit from viewing images, at a distance of 20 inches, on a (2*cos(60/2)*20) [~35 inch diagonal] monitor with 13MP resolution! That's way beyond the puny 2.3MP, 17-inch monitors that are the high end of laptop displays, and still far beyond the 30-inch, 5MP monitors presently available at the upper end of computer monitors...

Bottom line, the facts known about normal human vision clearly prove we would greatly benefit from higher resolution monitors. Why we don't have them 10 years after the technology was first developed is anybody's guess.
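The acuity arithmetic in this comment (1 arcminute of detail at a 20-inch / 500 mm viewing distance) can be checked directly; this is just the small-angle calculation, with the variable names my own:

```python
import math

distance_mm = 500                 # 20-inch viewing distance
arcmin_rad = math.radians(1 / 60) # one arcminute in radians

# Smallest detail a 1-arcminute eye resolves at that distance
# (small-angle: detail ~ distance * angle):
detail_mm = distance_mm * arcmin_rad
print(round(detail_mm, 3))        # ~0.145 mm

# Equivalent pixel density, and vertical pixels for an
# 11-inch-tall page viewed at that distance:
ppi = 25.4 / detail_mm
print(round(ppi))                 # ~175 PPI
print(round(ppi * 11))            # ~1921, i.e. about 1920 as claimed
```

The 0.145 mm, 175 PPI, and roughly 1920-pixel figures all check out; the 35-inch diagonal figure at the end of the comment is the one the follow-up post corrects to about 23 inches.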

Hazydave

Still camera imagers are very cheap; monitors, not so much. The economies of scale on monitors follow consumer video, at least to start with. In the early days, computer monitors were precisely PAL and NTSC monitors, and they evolved upward from there. Today, consumer video is nominally 2K (1920x1080, close enough to 2000x1000). There is already a move in cinematic video to 4K (nominally 4000x2000). You can get 4K monitors today, at a professional market price. If you want consumer prices, you'll have to wait some years, for consumer 4K to start pushing out today's 2K/HDTV stuff. As for stills... you're supposed to make prints from your 20Mpixel DSLR, not view it on a television. Some people... yeesh!

calhoun.andrew

I hope someone answers this, because I am curious how the conversation will go. If there are technical reasons we don't have 10MP computer monitors (cards can't handle them, too hard to get 10 million pixels on one screen without too many dead/stuck pixels, etc.), I hope someone who knows will pipe up and say why.

On the other hand, vucliriel didn't really make any sort of case as to why we *need* monitors larger than what we currently have. Cameras (tools for capturing an image in the first place) need higher resolution than the eventual display device, because any editing benefits from starting with a larger image, whether it be cropping or image compression. That is a good argument if the only source for what is displayed on a monitor is a camera, and that is not the case. Most of what hits my monitor is text, and I can fit plenty of that on my monitor already. But then there are also rendered-on-the-fly images for things like games, CAD software, and other 3-D environments. I am very happy that powerful graphics cards have somewhat recently come down in price enough to be affordable by normal people. The last thing I want is for monitor standards to skyrocket in resolution so that the graphics cards that can drive those monitors go right back up to astronomical prices.