Monday, September 27, 2010

RIM tosses its hat into the tablet market with the PlayBook

At the BlackBerry Developer Conference in San Francisco, RIM announced and demonstrated its new tablet, the PlayBook. Based on the QNX operating system, which RIM recently acquired, and a new software development kit called WebWorks, the PlayBook will compete directly with the iPad, but RIM announced no prices and no availability date beyond "early 2011".

The PlayBook will have a 7" 1024 x 600 display with capacitive touch-sensing, a dual-core 1GHz ARM Cortex-A9 processor, 1GB of RAM and dual cameras (a 3-megapixel front-facing camera with videoconferencing support, and a 5-megapixel camera on the back of the tablet). The company didn't state the amount of flash memory that will be available, but prototypes shown at the conference were marked "16GB" and "32GB".

RIM claims that the PlayBook will support OpenGL graphics, POSIX and HTML5, as well as Flash Player 10.1 and Adobe AIR applications. Engadget says that the user interface looks a bit like "a mashup of HP/Palm's WebOS and BlackBerry OS." They also got a chance to see the tablet in action (albeit in a static demonstration behind thick panes of plexiglas).

The question for RIM is what the tablet market is going to look like in early 2011, and how well the PlayBook will slot into the market at that time. Android 3.0 (Gingerbread) should be released in early 2011 and will unleash the flood of cheap Android tablets that we've been waiting for all year. January is a big month for Apple announcements, and we're likely to see the next-generation iPad, almost certainly with cameras, at that time. From what was said today, the PlayBook is not so much forward-looking as it is a response to things that the current iPad can't do. It would be nice to see RIM advance the state of the art, but unless there's a lot to the PlayBook or WebWorks that hasn't been seen yet, today's announcement wasn't encouraging.

Who's afraid of the big, fat pipe?

Cable, satellite and IPTV services all do the same thing: They distribute video content to set-top boxes in consumers' homes, where the content is watched on televisions. If you suggested to them on the record that they get rid of the content, get rid of the set-top boxes and simply allow consumers to get content from wherever they want, their heads would explode like they did in David Cronenberg's "Scanners".

Last week at a Goldman Sachs conference in New York, however, Ivan Seidenberg, the chairman of Verizon, suggested that the future of video is over-the-top content, and that a transition from Verizon selling the content to consumers getting content from their own sources is inevitable. Service providers would offer very fast Internet service (100Mbps or more--some Asian service providers are offering 1Gbps), thanks in part to not having to reserve so much bandwidth for video, and would act as common carriers--their pipes would carry just about anything. The service providers would make money on the connectivity, not the content. Set-top boxes and a whole lot of infrastructure and truck rolls would go away.

One enormous thing that goes away in the common carrier model is the need for service providers to negotiate for and license content. Most cable operators would tell you privately that they'd rather have a colonoscopy without sedatives than negotiate with Disney for retransmission rights. Disney's ESPN is the "900-pound gorilla" of cable channels, and Disney uses it like a hammer to get concessions from service providers, from paying to retransmit their local ABC owned-and-operated stations to carrying all of Disney's sports and children's networks. And Disney is only one player: There's CBS, NBC Universal, News Corporation/Fox, Time Warner and many others, all demanding their own share of operator revenues and priority positions in bundles.

There are only two service providers in the U.S. with their own large stables of content: Comcast, which owns E! Entertainment, Versus, The Golf Channel and G4, along with regional cable networks across the country, and Cablevision, whose Rainbow Media subsidiary owns AMC, IFC, The Sundance Channel and WeTV. (Since 2008, Time Warner Cable has been independent of Time Warner, which owns HBO, TNT, CNN, HLN, TCM and many other cable networks, as well as Warner Brothers.) Comcast, of course, is trying to get government approval to acquire NBC Universal, which will give it NBC, Telemundo, USA Network, MSNBC, CNBC, Bravo, Lifetime, SyFy, The Weather Channel, Mun2 and other wholly- and partially-owned cable networks, as well as Universal Studios.

Comcast and Cablevision can make money by licensing their content to other service providers. They sell themselves the content they need for their own cable systems through what's called "transfer pricing"--essentially, the money goes from one pocket to another. Thus, they have lower content acquisition costs and more control over their future outlook than other service providers.

But what if some of the larger service providers decide to go the common carrier route, or pursue a hybrid strategy of offering only local broadcast stations and a small number of national cable networks, with subscribers free to get anything else they want from anyone they want? To date, no one in the U.S. has had that option (legally), but it could happen. If it did, there would be a lot of cable and IPTV service provider executives who would sleep much better at night.

Friday, September 24, 2010

AngelGate: Send in the clowns

If you haven't been following the increasingly comical affair being called AngelGate, I'll run it down for you. First, a few definitions:
  • A venture capitalist (VC) is an individual or firm that invests in start-ups and small private companies in the hope of selling the stock for a large profit, either on the public market or to a larger company that acquires one of its portfolio companies.
  • Angel investors are individual venture capitalists who invest their own money in start-ups, usually very early in the companies' lives (typically seed or first rounds).
  • Superangels are individuals or small groups of investors that invest in a larger number of start-ups than an individual angel would normally invest in, but still focus on very early rounds.
  • Conventional Venture Capital firms invest in a lot of different companies at many stages of development. They have a lot more money to invest than the angels or superangels, and usually invest in later rounds.
Last Tuesday, Mike Arrington, the publisher of TechCrunch, learned about a secret meeting of superangels being held at a restaurant in San Francisco called Bin 38. He was told that he wouldn't be welcome, but he went anyway, and in a private meeting room, he found "ten or so" of the largest superangel investors in Silicon Valley. They fell into a silent stupor when they realized that he was in the room, and he left shortly afterward. Later, "sources", which Arrington says were three people who attended the meeting, told him that the following subjects were discussed (quoting directly from the TechCrunch article):
  • Complaints about Y Combinator’s growing power, and how to counteract competitiveness in Y Combinator deals
  • Complaints about rising deal valuations and how they can act as a group to reduce those valuations
  • How the group can act together to keep traditional venture capitalists out of deals entirely
  • How the group can act together to keep out new angel investors invading the market and driving up valuations.
  • More mundane things, like agreeing as a group not to accept convertible notes in deals (an entrepreneur-friendly type of deal).
  • One source has also said that there is a wiki of some sort that the group has that explicitly talks about how the group should act as one to keep deal valuations down.
Venture capitalists, whether large firms, angels or superangels, are competitors. They do work together at times on deals, but generally, they compete with each other to make investments. There are U.S. Federal laws that deal with collusion between competitors to fix prices, terms and conditions, and to keep out or limit the activities of other competitors, and they have nothing to do with monopolies or market share. The very act of competitors conspiring secretly as Arrington says they did could be interpreted as illegal.

The best thing for the participants to do would have been to say nothing and refuse to comment if asked by the press, but that's not what happened. The day after the TechCrunch article was posted, Dave McClure, a superangel, claimed that Arrington's charges were a "bullshit superangel conspiracy theory", admitted that he attended the meeting, gave his take on what was discussed, and then finished his screed with this line (and this is a direct quote): "i'm here to Disrupt, motherfucker. so go right ahead & Hate On Me."

Yesterday, Ron Conway, founder of SV Angel and one of the earliest angel investors, wrote a long email to attendees of the Bin 38 meeting to say that:
  • He didn't attend either meeting (apparently there were two meetings), although one of his partners did
  • He didn't agree with the agenda or process of the meetings
  • He'd really appreciate it if the other superangels never spoke to him again
  • His only interest is the entrepreneurs that he funds
  • And by the way, Dave McClure, don't write or say anything about this email
One of the most endearing things about Dave McClure is that he's incapable of keeping his mouth shut, so almost immediately after Conway sent his email, McClure demonstrated his mastery of Twitter by sending the entire world a message that he had intended to send only to another attendee of the dinners. In it, he confirmed that there were two meetings, said that he and other attendees were being "thrown under the bus" by Conway, and confirmed the identity of another person, David Lee, Conway's partner, who attended both meetings. McClure deleted the errant tweet, but not before it had been copied and widely distributed, including to TechCrunch.

TechCrunch ran McClure's tweet, and then they received a copy of Conway's email and ran that. McClure is continuing to respond to other postings around the web that agree with his point of view, when the best thing he could do right now is visit a foreign country with no Internet connectivity.

Whose story of what happened at the meetings is right, Arrington's or McClure's? Arrington didn't name any of his sources and didn't go into any specifics about what action(s) the group agreed to undertake (if in fact it agreed to do anything). For his part, McClure didn't mention that there were two meetings, not just one, in his response to Arrington's story, which certainly detracts from the credibility of his account. Also, his tweet in response to Conway's email didn't say that Conway's or Arrington's charges were wrong, only that he (McClure) and other attendees were being thrown under a bus by Conway.

What conclusions can we draw from this mess (so far)? In the song "If I Were a Rich Man" from "Fiddler on the Roof", Tevye sings:
The most important men in town would come to fawn on me!
They would ask me to advise them,
Like a Solomon the Wise.
"If you please, Reb Tevye..."
"Pardon me, Reb Tevye..."
Posing problems that would cross a rabbi's eyes!
And it won't make one bit of difference if I answer right or wrong.
When you're rich, they think you really know!
Now we know: Being rich doesn't make you smart or give you common sense.

Thursday, September 23, 2010

Blockbuster enters Chapter 11 bankruptcy

As expected for months, Blockbuster has entered Chapter 11 bankruptcy in the U.S., and its equivalent in several other countries. Blockbuster says that its 3,000 U.S. stores will continue to operate as usual during the bankruptcy (but that's probably not true--see below). In addition, its video kiosk business is owned by NCR and isn't part of the bankruptcy.

Blockbuster has asked the Bankruptcy Court to allow it to give the movie studios and distributors that supply it with 80% of its revenues priority for repayment, so that they don't cut off the company's ongoing supply of content. Senior bondholders, including Carl Icahn, who bought approximately one-third of the company's bonds as of September 17th, will be compensated with common stock after the company is reorganized and will effectively own the company. Other creditors and shareholders, with the exception of the movie studios and distributors, will be wiped out by the bankruptcy.

Blockbuster will most likely use the bankruptcy to close many of its poorly performing stores, canceling their leases and laying off their employees. It will also use the bankruptcy to renegotiate leases for other locations, and possibly to lower or eliminate some benefits and pensions. The company's hope is that by wiping out its debt and lowering its operating costs, it will be able to compete more effectively with Netflix and Redbox. However, given how far behind those two companies it is, it's hard to see how Blockbuster can catch up.

Wednesday, September 22, 2010

Roku steps up its game with the HD, XD and XDS

Earlier today, Roku updated its line of Internet set-top boxes with three new models: The Roku HD, XD and XDS. All three models are significantly smaller than the previous generation of Roku STBs, but the old Roku models weren't exactly huge.

The HD, priced at $59.95 (U.S.), has 720p HD video with composite and HDMI outputs and both wired Ethernet and wireless 802.11n WiFi interfaces. It uses the same remote control as previous Roku devices. The XD, priced at $79.95, has 1080p HD but is otherwise identical to the HD. However, it comes with a new remote control with "instant replay" and "info" buttons. (More on that in a moment.)

Moving into the HD-XR's old $99.95 price slot is the XDS, which has 1080p video; composite, component and HDMI outputs; dual-band 802.11n (both 2.4 and 5 GHz) WiFi; and a USB port for local playback of audio and video from a connected USB thumb drive or hard drive. Later this year, Roku says, it will offer a free software update for the XDS that will allow it to stream content from DLNA-compatible devices over the local network.

Like the XD, the XDS comes with the new remote control that has two additional buttons. The most important new button is "instant replay", which replays the previous 10 seconds of video every time the button is pushed, without requiring rebuffering.
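Roku hasn't said how the feature is implemented; conceptually, though, "instant replay" is just a seek backward within video the player has already downloaded, so nothing needs to be re-fetched from the network. A minimal sketch of the idea, with hypothetical names (not Roku's actual code):

```python
# Hypothetical sketch: because the last few seconds of video are still in
# the playback buffer, jumping back doesn't require fetching (and
# rebuffering) anything over the network.

REPLAY_SECONDS = 10

class BufferedPlayer:
    def __init__(self, buffer_start=0.0):
        self.buffer_start = buffer_start  # earliest buffered timestamp (seconds)
        self.position = buffer_start      # current playback position (seconds)

    def play_to(self, timestamp):
        """Advance playback; the buffer retains what was already fetched."""
        self.position = timestamp

    def instant_replay(self):
        """Jump back REPLAY_SECONDS, clamped to what's still buffered."""
        self.position = max(self.buffer_start, self.position - REPLAY_SECONDS)
        return self.position

player = BufferedPlayer()
player.play_to(25.0)
player.instant_replay()   # back to 15.0 seconds
player.instant_replay()   # pressed again: back to 5.0 seconds
```

Pressing the button repeatedly keeps stepping back in 10-second increments until it hits the start of whatever is still buffered.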

Roku's new HD, at $59.95, is most comparable to the new Apple TV, which is also limited to 720p, and for most users, it's all they'll need. If you're serious about local media streaming, the XDS is a better choice than the XD. All three models are very competitive with Apple TV, and will remain so unless and until Apple adds more streaming content, and possibly apps, to Apple TV.

Panasonic's new Micro Four Thirds Lenses

Panasonic launched three new Micro Four Thirds lenses at Photokina:
  • The H-FT012, a 12.5mm/F12 3D lens. Photos taken or videos shot with DSLRs using the new lens can be viewed on Panasonic's Viera 3D HDTVs. The lens is priced at $249.95 (U.S.).
  • Panasonic's tiny H-H014 14mm/F2.5 aspherical pancake lens, which the company claims is the world's lightest interchangeable lens, is priced at $399.95.
  • Panasonic's new zoom lens, the 100-300mm/F4.0-5.6/MEGA O.I.S. H-FS100300, will sell for $599.95. (I was concerned that it was going to be about the same price as the new GH2 body itself, but it's actually quite a bit less expensive.)
All three lenses will ship in November.

Tuesday, September 21, 2010

Panasonic's GH2: Evolutionary, not revolutionary

Photokina is open, and Panasonic has officially announced the GH2, its next-generation HDSLR, priced at $899.95 (U.S.) for the body only, $999.95 with a 14-42mm lens, and $1,499.95 with a 14-140mm lens. (Panasonic also announced a 100-300mm lens, but it looks like it will be almost as expensive as the GH2 body.) The camera will ship in December.

The GH2 has an 18 megapixel multiple-aspect-ratio imager (16 megapixels output) and an autofocus speed of 0.1 second. Like the GH1, the GH2 supports fully manual control in video mode. The camera's ISO range is 160-12,800, and Panasonic claims that the GH2's imager has 3 dB better noise performance and 200% better sensitivity than the imager in the GH1.

The GH2 captures 1080p at 60fps (NTSC) and 50fps (PAL), but outputs 1080i at both frame rates. Both the NTSC and PAL models also shoot and output 1080p at 24 fps, and 720p at 60 or 50 fps. The GH2 supports variable frame rates of 80%, 160%, 200% and 300%.
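Panasonic doesn't spell out the math behind those percentages, but if they describe apparent playback speed relative to real time (80% being mild slow motion, 300% a 3x fast-motion effect), they imply the sensor's capture rates. A back-of-the-envelope sketch under that assumption:

```python
def capture_rate(playback_fps, speed_percent):
    """Sensor frame rate needed so that footage played back at playback_fps
    appears to run at speed_percent of real-time speed (assumed
    interpretation of Panasonic's percentages, not a published formula)."""
    return playback_fps * 100 / speed_percent

# At the GH2's 24 fps playback rate:
for pct in (80, 160, 200, 300):
    print(f"{pct}% -> capture at {capture_rate(24, pct)} fps")
# 80% (slow motion) implies 30 fps capture; 300% (fast motion) implies 8 fps.
```

If Panasonic instead means the percentages as capture-rate multipliers, the arithmetic inverts; real-world reviews should settle which interpretation is right.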

The GH2 has continuous full-quality video output from its HDMI port while the camera is recording. It's not clear if the camera overlays settings on the HDMI image while recording like the Canon DSLRs do, but if it doesn't (or if the overlays can be turned off), the GH2 would be the first DSLR whose HDMI output can be monitored and recorded for real applications. The camera doesn't have continuous autofocus in video mode, but it does have Touch AF in video. It has a rotatable, 460K LCD and a 1.4 megapixel electronic viewfinder. Finally, the GH2 has an audio input and stereo microphone, but professional users would likely be better off with an external audio recorder.

Based on its specifications alone, the GH2 is a good news/bad news story. The higher pixel density of the 18 megapixel imager should, all else being equal, reduce low-light sensitivity, yet Panasonic claims that it's achieved significantly better sensitivity than the GH1. The higher pixel count should also worsen rolling shutter problems, and the GH2 doesn't have the features of the AG-AF100 that are designed to minimize rolling shutter, so it remains to be seen how the GH2 performs. Panasonic claims that the HDMI output works while the camera is recording, but it's not clear whether it's exactly the same image as the one being recorded on the GH2's memory cards. The GH2 captures 1080p but outputs 1080i. It doesn't have continuous autofocus in video mode, but it does have one-touch autofocus.

Once the real-world reviews start coming in, we'll know more about the video performance of the GH2, but at least on paper, it seems to address most of the shortcomings of the GH1 at a more aggressive price.

Sunday, September 19, 2010

Photokina starts on Tuesday--here's where to find news

Photokina, the biennial photographic business and technology conference, will start in Cologne, Germany on Tuesday, 21 September 2010. The U.S. press release services don't seem to be paying much attention to the show, but the Photokina show organizers themselves are offering a list of press releases from exhibitors, plus a database of new product announcements. As I find additional press release services covering the show, I'll add links to them to The Feldman File site.

When "a better mousetrap" isn't enough

You've heard the saying "Build a better mousetrap, and the world will beat a path to your door." It's the mantra of many engineering-driven organizations. Unfortunately, it's not true. We all know examples of products that were clearly technologically inferior but went on to great market success. In the U.S., one of the best examples was Sony's Betamax vs. Panasonic's and JVC's VHS. Most industry observers felt that Betamax was the better product--it certainly had better video quality. However, Betamax had two quality modes that allowed either one or two hours of recording on a single tape. VHS, on the other hand, had three modes that allowed one, two or three, and eventually two, four or six, hours of recording. The picture looked better on Betamax, but VHS buyers needed far fewer tapes, so VHS was seen as the much better value.

Sometimes, the most important innovations from technology companies have little or nothing to do with technology. In my opinion, Microsoft's two greatest innovations, the two things most responsible for its success, were software suites and per-machine pricing.

In the early days of the PC industry, consumers purchased applications one at a time, based on their needs. If you wanted a word processor, WordPerfect and WordStar were the preferred choices. The most popular spreadsheets by far were initially VisiCalc and then Lotus 1-2-3 and Borland's Quattro Pro. Microsoft had its own word processor, Word, and its own spreadsheet, first Multiplan and then Excel, but neither one was overtaking the market leaders. Then Microsoft had the brilliant idea of bundling all of its office productivity applications together and selling them at about the same price as a single copy of a competitor's single-purpose application. The result was Microsoft Office.

Consumers immediately saw the value in Office. They might have preferred WordPerfect as a word processor or 1-2-3 as a spreadsheet, but for the same price, they could get a word processor, a spreadsheet and a presentation tool (PowerPoint). Microsoft Office and its applications quickly dominated the market. Competitors tried to respond by acquiring other products to create their own suites, but Microsoft's market dominance was never challenged.

Microsoft's second innovation was per-machine licensing. Let's say that you were a large PC manufacturer, and you had a choice of a variety of operating systems--in particular, Microsoft's MS-DOS, Digital Research's DR-DOS and IBM's OS/2. Each of these companies would sell you its operating system at a price based on the total number of copies that you purchased. However, Microsoft came up with a unique new pricing model based not on the number of copies of MS-DOS that you shipped but rather on the number of computers you shipped that could run MS-DOS. It cost much less per unit to license MS-DOS under this new model, but you had to buy a copy for every computer you built that could run it.

This model almost immediately squeezed Microsoft's competitors out of the business of selling to computer manufacturers. MS-DOS was the "industry standard" and customers expected it. If you also wanted to offer DR-DOS, which many people thought was superior to MS-DOS, you had two choices: buy only the copies of MS-DOS that you needed, at a much higher per-copy price that you'd have to pass on to consumers, or buy two operating systems--MS-DOS for every machine, plus DR-DOS for some models. Very quickly, manufacturers decided that MS-DOS was good enough, and that it wasn't worth raising prices, or buying two operating systems and throwing one away, in order to offer a choice.
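The real OEM prices were confidential, so the numbers in this sketch are invented purely for illustration; they just show the shape of the incentive the per-machine model created:

```python
# Illustrative only: hypothetical prices showing why per-machine licensing
# squeezed out competing operating systems. Assume an OEM ships 100,000 PCs
# and wants to offer a competing DOS on 20% of them.

machines = 100_000
per_copy_price = 25.0       # hypothetical standard per-copy OEM price
per_machine_price = 10.0    # hypothetical discounted per-machine price
competitor_price = 20.0     # hypothetical price of the competing DOS

# Option A: per-copy licensing for only the 80,000 machines that need the
# standard DOS, plus the competitor's DOS on the other 20,000.
option_a = 80_000 * per_copy_price + 20_000 * competitor_price

# Option B: per-machine licensing -- every machine pays, including the
# 20,000 that actually ship with the competitor's DOS (an OS paid for twice).
option_b = machines * per_machine_price + 20_000 * competitor_price

# Option C: per-machine licensing, competitor dropped entirely.
option_c = machines * per_machine_price

print(option_a, option_b, option_c)
```

With these made-up numbers, option C is the cheapest by a wide margin, which is exactly the choice manufacturers made: drop the competitor and ship the standard DOS on everything.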

When Microsoft launched Windows, which was originally an add-on to DOS, it did the same thing: Manufacturers who wanted the lowest prices had to license DOS and Windows together for every machine that could run them. Competitive graphic environments such as GEM and Go didn't have a chance. This pricing model, more than anything else, built Microsoft's monopoly in desktop operating systems.

Have you ever wondered why the vast majority of computer manufacturers have used Intel processors for years, even when AMD had equivalent (or better) processors at lower prices? One big reason was that Intel was paying computer manufacturers under the table not to use AMD's processors (a fact recently admitted by Intel and Dell), but Intel had another, above-board tool for getting buy-in. You know that Intel "bum-bum-bum-bum" sound at the end of many PC commercials? Those commercials are paid for in large part by Intel. Through the use of co-op agreements and spiffs (sometimes called "sales promotion incentive funds"), Intel reviews the commercials and, if they're approved, will often pay 50% or more of the cost to air the ads. For the PC manufacturers, it's like doubling their advertising budgets. Intel won't approve payments for any ad that mentions a computer using a competing processor, so there's a strong incentive to stick with Intel.

That brings us to a current example: How much is it worth to put the "Google" logo on your smartphones and have access to the Android Market? If you're Motorola or Samsung, it's apparently worth quite a lot. According to a lawsuit filed last week by Skyhook, a geopositioning technology company, Google withheld its approval for usage of the Google logo and access to the Android Market in order to force both companies to drop Skyhook in favor of Google's own positioning services. According to Skyhook, Google operates an "Android Compatibility Program," and products must be approved by this program in order to carry Google trademarks, license Google applications and gain access to the Android Market.

The Android Compatibility Program has two components: the Compatibility Test Suite, software that tests whether the submitted hardware and software are compatible with published Android specifications, and the Compatibility Definition Document, which adds requirements for what constitutes full compliance with Android specifications. According to Skyhook, the Compatibility Test Suite is an objective test that can be run by manufacturers and gives "go/no-go" answers, while the Compatibility Definition Document is an amorphous, subjective document that can be freely interpreted by Google employees.

Skyhook claims that when Motorola submitted a phone that incorporated Skyhook's geolocation system, Google demanded that Skyhook share its geolocation information with Google in order to get approval. When Skyhook refused, Google then demanded that Motorola's phone run Skyhook's and Google's own geolocation systems simultaneously, which would have used far too much power and would have been impractical. Google additionally demanded that whenever the Skyhook system was in use, the phone's user had to be warned that their location data was going over a third-party network and might not be secure. After Skyhook refused to implement this final specification, Google demanded that Motorola remove the Skyhook system completely from its phone in order to get certification, and Motorola complied.

A second company, named "Company X" in Skyhook's lawsuit but most likely Samsung, also adopted the Skyhook technology and initially received shipping approval from Google. However, when Motorola learned of that decision and asked to be allowed to reinstate Skyhook's technology in its own phone, Google withdrew Company X's approval to ship, reinstating it only after Company X removed Skyhook's technology from its phone.

As you can see, the better mousetrap winning is the exception, not the rule.

Friday, September 17, 2010

A clarification of my Intel Sandy Bridge and Light Peak stories

I wrote two stories earlier this week on Sandy Bridge, Intel's forthcoming Core architecture, and Light Peak, Intel's new optical interconnect technology that will compete with USB 3.0. EE Times then reported that while Light Peak components will ship in 2011, systems using Light Peak probably won't be available until 2012, and I updated both posts. Unfortunately, I had Sandy Bridge on the brain when I made the changes, and wrote that Sandy Bridge systems won't be available until 2012. That's incorrect. Here are the facts as best I know them (today):
  • Sandy Bridge versions of Intel's Core i3, i5 and i7 processors will ship in Q1 2011, and systems using them should be available shortly after--certainly by the end of Q2 2011.
  • Light Peak components will ship some time in 2011, but Intel doesn't expect them to be integrated into systems until 2012.
Sorry for the confusion.

Not much on the Android tablet front for this holiday season?

Two pieces of news provide some insight into the timing of the next major release of Google's Android and its impact on Android tablets for the coming holiday season. Yesterday, Samsung formally announced that all four major U.S. mobile carriers will sell the Galaxy Tab in the U.S., starting before the end of the year. However, the company said that the Galaxy Tab will ship without 4G support and will not work as a phone. Today, Motorola told the Wall Street Journal that it will delay the introduction of its first Android tablet until next year.

Let's put these two announcements in perspective: Google has said that the current version of Android, 2.2, also called Froyo, is not appropriate for tablets. While Google will approve access for devices to its Android Market on a case-by-case basis, the company has called for tablet vendors to wait for the release of Android 3.0, also called Gingerbread. Motorola Co-CEO Sanjay Jha seconded Google's recommendation, saying that he doesn't believe that Froyo is appropriate for tablets and that Motorola will wait to release its tablet until Android is ready.

In yesterday's announcement, Samsung said that while the Galaxy Tab will run Froyo, it has written apps specifically for the device to take advantage of its capabilities. Samsung warned that most existing Android Market apps won't run properly on the Galaxy Tab.

Consider one additional item: Verizon is launching its LTE 4G service in 30 cities in the U.S. before the end of the year, and Motorola was widely reported to be supplying a tablet for that launch. Therefore, there was an assumption that Google would release Gingerbread in November. However, we now know that the Motorola tablet won't ship until 2011 and that Samsung won't support 4G in the Galaxy Tab when it ships later this year.

Put it all together, and it's fairly clear that Gingerbread won't ship until 2011. That means that all of the Android tablets that ship this year will be using an operating system unsuited for tablets, and will have limited or no access to the Android Market. For these reasons, I believe that the forecasts for big sales of Android tablets this holiday season are going to have to be scaled back considerably.

Apple will continue to have a largely open field, unhindered by significant competition, until next year. I have no idea whether there's any truth to the rumors that Apple will launch a 7" iPad in time for the holiday season, but if it does, it will only add momentum to Apple's tablet business and push competitors deeper into the hole for 2011.

Thursday, September 16, 2010

PR Lesson 101 for Samsung: How to frustrate your audience

Samsung just concluded its U.S. announcement of the Galaxy Tab. The company had hyped the event online, and there was some live-blogging activity. Keep in mind that the Galaxy Tab was announced and demonstrated in detail more than a week ago at IFA in Berlin, and that announcement was widely covered. What the press, bloggers and early adopters were expecting to get from Samsung today were answers to three simple questions:
  • Which carriers will sell the Galaxy Tab,
  • What price(s) will they sell it at, and
  • When will it be available?
Samsung answered only the first question: all four major U.S. carriers (Verizon, AT&T, Sprint and T-Mobile) will carry it. The prices will be set by the carriers, and Samsung refused to answer any questions about specifics. Not a single carrier spoke at the event. As for availability, all Samsung was willing to say was "before the holidays."

There were only two hardware announcements of note: First, Samsung will sell a (presumably unsubsidized) WiFi-only model in addition to the 3G/WiFi models that the carriers will sell, and second, the Galaxy Tab will not have voice calling capabilities. That's it.

This is PR 101: If you can't answer the questions your audience most wants answered, don't stage the event. When the event dragged on and on without any mention of availability or price, I was afraid that Samsung would do exactly what it ended up doing: shift the burden of answering the key questions to its carriers, who are no doubt thrilled about it.

Samsung should have sent out a press release launching its website and posted videos and specifications of the product, including the video produced by Adobe, instead of having this event. The next time around, when the product is actually ready, far fewer people are going to pay attention to the announcement(s).

Logitech's Google TV set-top-box price and availability date announced

According to Engadget, Logitech has announced that its Google TV set-top box, the Revue, will be priced at $299 in the U.S. and will ship on September 29th. Dish Network subscribers can purchase a single Revue at a discounted price of $179. My suspicion is that Logitech is going to have a very tough time of it this holiday season.

The Revue was clearly designed to be competitive with the last-generation Apple TV, but the new version, which will be shipping at about the same time, will be priced at $99. Roku's current-generation set-top boxes run from $59.99 to $99.99, although they have new-generation models in the pipeline that may be more expensive. Boxee's Boxee Box by D-Link has a $229.99 list price, but Amazon has it available for pre-order at $199.99.

Looking at the Logitech Revue's price, I don't see how it competes in the current environment. By basing its architecture on Intel, Google automatically made its partners vulnerable to price competition from companies using ARM processors, and that's exactly what's happening. If the price point for a viable add-on set-top box drops to $199 or less (and especially if it drops to $99), I don't see how Google TV set-top boxes will be able to compete without subsidies.

Tuesday, September 14, 2010

Nikon's D7000 is official!

Specs and prices for Nikon's new D7000 are out, thanks to Engadget. Here are the key specs for video DSLR users:
  • ISO range 100-6,400 with 25,600 extended range
  • Dual SD memory card slots
  • 6fps still frame burst rate
  • 1080P at 24fps, 720P at 24 and 30fps
  • Continuous autofocus in video mode
  • Maximum recording time 20 minutes per clip (multiple clips can be recorded if there's sufficient memory)
  • Built-in mono microphone with jack for external stereo mic
And now, the price: In the U.S., the list price of the D7000 without lens will be $1,199.95, and $1,499.95 for a D7000 with an 18-105mm f/3.5-5.6G ED VR lens. Nikon will begin shipping the camera next month.

The D7000 is priced just above Canon's 60D and well below the 7D, and it has continuous video autofocus, which neither the 60D nor the 7D offers. Once formal reviews start coming out, we'll know much more.

Intel's Sandy Bridge...and a little Mac (and Final Cut Pro) speculation

At Intel's IDF Conference, the company provided more details about its new Sandy Bridge architecture. According to Anandtech, the first Sandy Bridge processors will be shipped for performance-level (gaming and media creation) PCs in early 2011, and will migrate to entry-level PCs in 2012. One major change in Sandy Bridge is that its GPU core should provide performance easily double that of Intel's existing integrated graphics. It won't keep NVIDIA or AMD up at night, but it should be good enough to lessen the demand for add-on graphics cards in entry-level PCs.

Sandy Bridge has an integrated MPEG2, VC1 and H.264/AVC decoder that Intel claims will use only half the processor power for HD playback as existing processors. It also has an AVC encoder/transcoder that, in a demonstration, was able to transcode a three-minute 1080P 30Mbps video into a 640 x 360 iPhone video in 14 seconds, at a rate of 400 fps. This gets into the performance range of high-end, GPU-accelerated encoders. Sandy Bridge will also have an enhanced Turbo Boost feature that will allow the clock speed of individual cores to be boosted beyond the normal thermal design power (the maximum safe power dissipation of the chip) for brief periods of time.
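The claimed transcode rate checks out against the demo figures quoted above — a quick sanity check:

```python
# Sanity-check the transcode rate from Intel's Sandy Bridge demo,
# using the figures quoted above (three-minute 1080p30 clip in 14 seconds).
clip_seconds = 3 * 60               # three-minute source clip
source_fps = 30                     # 1080p source at 30 frames per second
frames = clip_seconds * source_fps  # 5,400 frames in the clip
transcode_seconds = 14

rate = frames / transcode_seconds   # frames transcoded per second
print(round(rate))                  # ~386 fps, in line with the "400 fps" figure
```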

Okay, so I said something about the Mac in the title, right? According to Anandtech, Core i3, i5 and i7 processors with the Sandy Bridge architecture will ship in Q1 2011. Every MacBook Pro, iMac and Mac Pro ships today with at least a Core i3. Next, let's add Light Peak, Intel's new 10Gbps optical competitor to USB 3.0. There's no support for Light Peak in the first Intel chipset announced for Sandy Bridge, but that doesn't mean that it can't be added.

Add it all together, and it sounds like the Apple notebook and iMac product lines will be fully refreshed some time next year with a combination of Sandy Bridge and Light Peak. There go the I/O limitations of Apple's notebooks and desktops. (Apple won't have to reengineer the Mac Pro right away--they can simply offer a PCI Light Peak card.)

Let's throw one more thing into the mix. There's been a lot of rumor and speculation surrounding the next release of Apple's Final Cut Studio, with a battle between bloggers--some saying that a new version will be released no later than NAB in April 2011, and others saying that it won't happen until 2012. The "2011" school says that the new version of Final Cut Studio will have Adobe Mercury Engine-like performance, but the "2012" school says that's not possible without major architectural changes. If Apple is writing the next version of Final Cut Studio to take full advantage of the features in Sandy Bridge, it most definitely is possible in the 2011 timeframe.

So here's my semi-informed speculation: In Q1 (perhaps late Q1), we'll see the first updated MacBook Pros with Sandy Bridge and Light Peak announced. New iMacs will follow. Then, in the April timeframe, just before NAB, Apple will announce the new Final Cut Studio that takes full advantage of the new computers ("great time to upgrade!"). Deliveries of everything will occur by the end of Q2.

Is USB 3.0 a dead end? Intel thinks it is

The Intel Developer Forum is going on in San Francisco. Prior to the start of the conference, observers thought that Intel would announce support for USB 3.0, the new high-speed version of the interface, in a forthcoming chip set. Intel agreed to support the USB 3.0 specification only this summer. However, at IDF, there's precious little USB 3.0 to be seen.

Instead, Intel demonstrated its Light Peak technology on the floor. Light Peak is a fiber-optic interconnection technology that uses ports and connectors that look very similar to, but are incompatible with, USB 3.0. Light Peak has a theoretical maximum transfer speed of 10Gbps on each port, and it can be daisy-chained from device to device. At IDF, Intel demonstrated Light Peak streaming uncompressed 1080P HD to a modified Samsung HDTV at a sustained rate of almost 770MB/second (approx. 6.16Gbps.)
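The demo's sustained rate converts cleanly from bytes to bits, showing how close the demonstration came to Light Peak's ceiling:

```python
# Convert the demo's sustained throughput from megabytes per second to
# gigabits per second and compare it with Light Peak's 10 Gbps maximum.
mb_per_second = 770                  # sustained rate in Intel's demo
gbps = mb_per_second * 8 / 1000      # 8 bits per byte; 1,000 Mb per Gb
print(f"{gbps:.2f} Gbps")            # 6.16 Gbps, ~62% of the 10 Gbps link
```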

Intel emphasizes the protocol-neutral nature of Light Peak. HDMI, IP, DVI, SDI and other protocols can be sent over a Light Peak connection. Consider that a single Light Peak connection could carry three 3Gbps HD-SDI signals. With the right adapters, future PCs and other devices will no longer be limited in their I/O capabilities. Intel believes that it will deliver Light Peak connections capable of 100Gbps by the end of this decade.

When Intel executives were asked whether Sandy Bridge, the next-generation Intel Core architecture scheduled for delivery by mid-2011, will support either USB 3.0 or Light Peak, they demurred, but they did go on the record saying that Light Peak will be shipping in volume next year. (Update, 16 September 2010: EETimes reports this morning that while Intel plans to ship Light Peak components in 2011, it doesn't expect to see systems implementing Light Peak until 2012. Corrected--the previous version of this sentence said "Sandy Bridge.")

Does this mean that USB 3.0 is a dead end? Not necessarily, since it has a head start in the market and is backward-compatible to USB 2.0. However, if Intel does succeed in shipping Light Peak components next year in quantity, my bet is that we'll see a major move to Light Peak for high-performance connections. Gaming and media creation systems as well as servers will likely adopt Light Peak, as component costs are likely to be high for at least the first few years, and those systems can support higher end-user costs.

Monday, September 13, 2010

Piecing the Android tablet story together

I'm trying to clarify a few things about Google's plans for Android and what they mean for future devices. I've written a couple of entries about Samsung's Galaxy Tab, a 7" Android tablet that runs Android 2.2, also called Froyo. At IFA, Samsung demonstrated access to the Android Market using the Galaxy Tab. However, TechRadar.com reported the following quote:
"Android is an open platform. We saw at IFA 2010 all sorts of devices running Android, so it already running on tablets," said Hugo Barra, director of products for mobile at Google. "But the way Android Market works is it's not going to be available on devices that don't allow applications to run correctly. Which devices do, and which don't will be unit specific, but Froyo is not optimised for use on tablets. If you want Android market on that platform, the apps just wouldn't run, [Froyo] is just not designed for that form factor. We want to make sure that we're going to create a application distribution mechanism for the Android market, to ensure our users have right experience."
The Galaxy Tab runs Froyo, yet it had access to the Android Market at IFA. Other vendors have claimed that their tablets have access to the Market, only to back away from the claim once they start delivering products to reviewers and customers. I find it difficult to believe that Samsung would announce something as important as support for Android Market without confirming it with Google.

The key seems to be that the Galaxy Tab can work as a conventional, albeit huge, smartphone. That's how Samsung can make the Android Market available to Galaxy Tab users. However, if you take Mr. Barra's comments at face value, Froyo isn't optimized for tablets. At IFA, Samsung said that the Galaxy Tab will be updated with Gingerbread, the next major version of Android, when it's available. However, previous experience shows that it can take a good deal of time for manufacturers to update products with the latest version of Android.

The Galaxy Tab might be a safe purchase, but for risk-averse consumers, the best course of action might be to wait until vendors are shipping tablets with Gingerbread already installed.

Ivi TV: Let's see how long this lasts (Updated with new information)

Ivi, Inc., a Seattle-based company, has launched what it calls a "revolutionary" live television application. For $4.99 (U.S.) per month, they will stream the live feeds from 16 over-the-air television stations in New York City and 10 from Seattle straight to your computer. Their feeds include ABC, CBS, Fox, NBC, PBS, Telemundo, Univision and other affiliates, as well as independents.

Update, 14 September 2010: Ivi seems to be trying to take advantage of U.S. Copyright law that was written well before the advent of the Internet, while simultaneously avoiding FCC rules that make the company's plans patently illegal. (Keep in mind that I'm not a lawyer, so I'm bringing a layman's knowledge to the situation.) In the Code of Federal Regulations, Title 37, Section 201.17, "cable systems" as defined by this statute are entitled to retransmit ("secondarily transmit") television stations' signals under a statutory (or "compulsory") license. The cable system pays a royalty based on its revenues to the U.S. Copyright Office. This portion of the statute was written in 1978, almost 20 years before the commercialization of the Internet, and it didn't contemplate a technology that would make a national cable service feasible outside of FCC regulations.

The FCC has its own rules on permission and compensation for retransmitting the signals from broadcast television stations. Here's a direct quote from the FCC's Fact Sheet on Cable Carriage of Broadcast Stations:
The Communications Act prohibits cable operators and other multichannel video programming distributors from retransmitting commercial television, low power television and radio broadcast signals without first obtaining the broadcaster's consent. This permission is commonly referred to as "retransmission consent" and may involve some compensation from the cable company to the broadcaster for the use of the signal.
If ivi is a cable operator or other multichannel video programming distributor, the FCC's rules require the company to get permission from broadcasters before they retransmit their signals.

Under Title 37, Section 201.17, ivi claims that it's a cable system and has a right to a statutory license to television stations' programming, no matter where they're located. However, ivi claims that it's not subject to regulation by the FCC, and therefore is not a cable system. CFR 37 Section 201.17 says that cable systems are entitled to statutory licenses, even if they're not defined as cable systems by the FCC.

So, what's likely to happen? My suspicion is that there are many highly paid attorneys at the television networks and cable operators working on this right now. One option would be to get the U.S. Congress to amend or repeal Section 201.17, since the FCC's rules supersede it. Another option would be to get the FCC to rule that ivi is a legitimate cable operator, and the fact that it owns no plant doesn't mean that it's free from FCC regulation. A third option would be for programming suppliers (the major networks, movie studios and syndicators) to file suit against ivi, charging the company with interfering with their exclusive distribution contracts with local television stations outside the New York City and Seattle markets. These suppliers could petition for an emergency injunction to shut down ivi's service.

Ivi might have a better case than I anticipated when I first wrote this post, but I still believe that it's only a matter of time before it gets shut down. It may take a year or two for the necessary statutory changes to be put into place, but a preliminary injunction can be put into effect in a matter of weeks, or even days.

New AppleTV uses A4 processor; what impact will ARM's A15 have?

According to the latest EETimes magazine, the new AppleTV uses Apple's own A4 processor in place of three Intel chips in the previous design: A Pentium M-class CPU, a memory controller with embedded graphics, and an I/O chip. Those three chips cost Apple $60 to $65 and took up 975 square mm of space. By comparison, the A4, manufactured by Samsung, costs Apple $15 to $20 and takes up 196 square mm of space.
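Using the midpoints of the EETimes cost estimates quoted above, the savings per unit work out roughly as follows:

```python
# Back-of-the-envelope savings from Apple's switch to the A4,
# using the midpoints of the EETimes cost estimates quoted above.
intel_cost = (60 + 65) / 2    # midpoint of the $60-$65 three-chip Intel set
a4_cost = (15 + 20) / 2       # midpoint of the $15-$20 A4 estimate
print(intel_cost - a4_cost)   # $45.00 saved per unit

area_saved = 1 - 196 / 975    # 975 sq mm of board space shrinks to 196 sq mm
print(f"{area_saved:.0%}")    # roughly 80% less board area
```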

The switch to the A4 explains why Apple won't port the new AppleTV user interface or features to the previous model. It also puts Intel at an increasing disadvantage in Apple's future product plans, since only the Mac family now uses Intel processors.

Another story in the same issue of EETimes has a great deal of potential importance to Apple's future desktop computer plans. ARM, the RISC processor developer that licenses its designs to a variety of semiconductor companies, disclosed a small amount of information about its new 2.5 GHz Cortex A15 processor. This processor is multicore-capable and uses much less power than the current generation of server chips from Intel and AMD. Apple could build a notebook- and desktop-oriented version of the A4 using the Cortex A15 as its foundation, and displace Intel from its product line entirely. The A15 is scheduled to begin shipping in 2012.

Samsung's Galaxy Tab to be announced in the US this week?

According to FierceWireless, Samsung will formally announce its U.S. carrier partners and launch date(s) for the 7" Galaxy Tab Android-based tablet this Thursday, September 16th, in New York. Bloomberg reports that AT&T, Sprint and Verizon will all carry the Galaxy Tab, and that the price of the device will be $200 to $300, depending on carrier subsidies. According to Bloomberg, AT&T and Sprint have decided on their subsidies, while Verizon is still uncertain.

The Galaxy Tab can be used as a smartphone, but given its size, most customers are likely to keep their existing phones. If carriers require a full phone account in order to get a subsidized Galaxy Tab, it's likely to lose a great deal of its potential market. On the other hand, if the carriers offer both reasonably-priced data-only subscriptions along with conventional voice and data plans, interest in the tablet should translate into significant sales.

Previous reports indicated that the Galaxy Tab will ship next month in Europe, and that it will ship in the U.S. in November, in time for the holiday sales season.

Sunday, September 12, 2010

Canon's rumored EIS mirrorless DSLR

EOSHD.com has broken preliminary details of Canon's answer to Panasonic's and Olympus' Micro Four Thirds and Sony's NEX series: The mirrorless EIS (Electronic Image System). The first model, the EIS 60, is rumored to be scheduled to ship in calendar Q2 2011. The EIS 60 is said to have a 22 megapixel imager, which should immediately be a cause for concern for a couple of reasons:
  • More pixels means less surface area (and thus less light sensitivity) per pixel.
  • The rolling shutter problem that DSLRs have in video mode is due to the process that they use to get the resolution of the image data coming off the sensor down to 1920 x 1080. In most cases, manufacturers simply throw most of the lines of image data away, but the result is the "jello effect" when either moving the camera or shooting fast motion. Tossing lines away also results in image softness and moire patterns, especially when shooting objects with closely spaced horizontal or vertical lines, such as brick buildings and striped clothing.
To get around both problems, Canon has reportedly developed a technique called "Pixel Fusion" that merges a matrix of pixels together to form a single pixel. In still mode, 4 pixels (2 x 2) are merged into a single pixel, for a net resolution of 5.5 megapixels. In video mode, 9 pixels (3 x 3) are merged into a single pixel, for 1920 x 1080P resolution. The result is that the EIS 60 can achieve better speed in still mode (up to 20 fps) with excellent low-light performance (in "Pixel Fusion" mode, the extended maximum ISO of the EIS 60 will be 25,600), and can read the data off the imager much faster in video mode in order to avoid the rolling shutter effect.
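The rumored binning arithmetic is internally consistent, as a quick check shows (all figures here come from the EOSHD.com report, not from Canon):

```python
# Check the resolution arithmetic behind the rumored "Pixel Fusion" modes.
sensor_mp = 22.0                    # rumored 22-megapixel imager

still_mp = sensor_mp / (2 * 2)      # 2 x 2 binning in still mode
print(still_mp)                     # 5.5 megapixels, matching the report

video_mp = sensor_mp / (3 * 3)      # 3 x 3 binning in video mode
needed_mp = 1920 * 1080 / 1e6       # ~2.07 MP needed for full 1080p
print(round(video_mp, 2), round(needed_mp, 2))  # 2.44 vs. 2.07 -- enough pixels
```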

EOSHD.com reports that the new EIS format will use a new lens mount, but Canon will also offer an EF-to-EIS adapter. The lenses under development for the new format (some of which won't be available at launch) include:
  • 12-75mm F2.8-4 IS Macro
  • 70-300mm F3.5-5.6 IS
  • 5mm F4 fisheye
  • 8-25mm F4 wide-angle zoom
  • 14mm F2 pancake
  • 25mm F1.2 pancake
  • 45mm F1.5 pancake
  • 65mm F2.0 Macro (1:1, 2:1 is equivalent to full-size)
Obviously, all of this should be taken with a grain of salt until Canon makes an official statement. If this camera is actually under development and the timing is correct, it's unlikely that Canon will announce anything about it at Photokina later this month to avoid hurting sales of its cameras for this holiday season. I don't expect any confirmation from Canon until January at the earliest, but we may also hear about a true camcorder using the same imaging system at NAB next April.

Saturday, September 11, 2010

The future is coming early to college bookstores

The trend toward print-on-demand books in college bookstores is gathering momentum. Last month, I wrote about the Espresso Book Machine, and now Hewlett Packard is getting into the act. Both Arizona State University and the University of Arizona have installed print-on-demand systems in their university-owned and operated bookstores. ASU is using the new HP system, while U of A has used an Espresso Book Machine for a year. (Portland State and the University of Kansas are also testing the HP system in their bookstores.)

The biggest goal of both universities is to drive down the cost of college textbooks. At the three largest Arizona universities, students will spend an average of $3,200 on textbooks and supplies over four years. Nationally, the average price of new textbooks went up 12% in 2009, far faster than inflation. Only a small selection of titles is available for print-on-demand at the two universities, but according to Dennis Mekelburg, the associate director of ASU Bookstores, if his school can replace conventional printed textbooks and materials with print-on-demand for 5% of its courses, students would save $500,000 per semester.

McGraw-Hill, John Wiley & Sons and Cengage Learning are supplying a small selection of their titles to ASU for print-on-demand, and the savings can be significant. One Marketing instructor at ASU switched from a conventionally-printed textbook and workbook that sold last year for $250 new to a softcover version of the same textbook and a workbook printed on demand, together priced at about $62.

In the long run, I believe that most textbooks will be sold as electronic versions, and print-on-demand will be used to provide paper copies for special situations. The sophistication of eTextbooks and the devices used to read them is increasing so quickly that paper textbooks will soon be considered relics.

One big question is what the impact of these changes will be on the bookstores themselves. To date, the print-on-demand phenomenon seems to be limited to schools that own and operate their own bookstores. Publishers are demonstrating a willingness to negotiate better deals directly with colleges and universities, and some schools are in turn showing a willingness to cut the private operators of their bookstores out of the channel in order to lower the prices that their students pay for textbooks and materials. (In the U.S., the largest private college bookstore operators are Barnes & Noble and Follett.) If the college bookstores no longer sell textbooks, or no longer make any money selling them, what will be the role of the private operators? Probably not much. In fact, I wouldn't at all be surprised to see general retailers, anyone from 7-11 to Wal-Mart, compete for the contracts to run these "bookstores". They're likely much better positioned to sell general merchandise than the existing private bookstore operators.

Friday, September 10, 2010

Check list marketing, or why vendors put useless features into their products

Yesterday, I wrote a post about Pentax's new K-r DSLR. Like all new DSLRs, it has a video mode, but the K-r's is fixed at 720P at 25 fps. Fine for Europe, but completely useless in the U.S. Pentax isn't the only offender, of course; it's virtually impossible to find a DSLR from anyone that has usable audio, which is why the Zoom H4n audio recorder is so popular.

Why do manufacturers add features to their products that don't work or are useless? It's usually due to "check list marketing." You've probably seen the comparison lists that show how the features and functions of various products compare. The lists are almost always put together by a vendor to show how much better their products are than the competition's. No one wants to look bad on one of these check lists, so the sales team or product managers will push the engineering team to add features. The engineering team will usually resist, but sales and product management will insist that engineering implement the features in some way, so that they can add them to their check lists.

That's probably how the K-r got its video mode. The website and data sheet for the camera boast that it has HD video. Yes, at the lowest resolution that can be called HD, with a frame rate that's useless in North America. They don't say that in the headline, of course; you have to read down to the specifications to find out the bad news. The check list only says "HD video". Canon? Check. Panasonic? Check. Nikon? Check. Sony? Check. Pentax? Check. It didn't say "HD video that you can actually use," or "HD video that you can edit," or "HD video that won't make you throw your camera through a plate glass window." There would be some checks missing on that list, and not just for Pentax.

I'd rather see companies implement features the right way, and have the courage to leave out features that can't be done well, rather than implement useless features simply to fill out a check list. If Pentax's engineers didn't have to implement video mode, could they have used that time and those resources to make the still features of the camera even better? There are many photographers who buy DSLRs for their ability to shoot stills and couldn't care less about video. It would have been a retro step by Pentax, but in this case, it would have been the right thing to do.

Thursday, September 09, 2010

Pentax announces its K-arrgh DSLR

When I learned photography many years ago, my second SLR was a Pentax, so I've always had a soft spot for their cameras. They've just announced a new DSLR, the K-r, and it looks like a very good still camera for the price ($849.95 in the U.S. with an 18-55mm lens.) Unfortunately, its video mode is crippled: 720P at 25fps. Given that it's priced about the same as Canon's T2i/550D and significantly more than Nikon's D3100, it's not competitive on the basis of its video mode. I can't see why anyone would opt for the K-r unless their only interest is still photography.

Are cable and IPTV operators going to "back into" a la carte?

NewTeeVee ran an article last week about AT&T's decision to drop Crown Media Holdings' Hallmark Channel and Hallmark Movie Channel from its 2.5 million U-verse IPTV households in the U.S., effective September 1st. As of this writing (September 9th), the channels haven't been reinstated. NewTeeVee said that a JPMorgan research report from last April found that there are only 50 cable networks in the U.S. for which 10% or more of cable subscribers would switch service providers in order to watch them. The Hallmark channels aren't within that group of 50 essential networks.

Cable, satellite and IPTV systems have the capacity to carry hundreds of channels, and system operators have been racing each other to offer the most networks. New and small cable networks can get carriage on cable systems by giving their programming to cable operators at no cost or paying to have their programming carried (called "reverse compensation".) However, most of those networks get little viewership, and the service providers are essentially wasting bandwidth by carrying them.

Cable operators have successfully fought the FCC for years to prevent the imposition of rules that would require them to allow viewers to choose, and pay for, only the channels they want to watch (often called "a la carte".) However, as they begin to drop lesser-watched channels in order to save on carriage fees, the operators of those networks will put political pressure on the FCC to protect them.

Cable and IPTV operators could turn this situation to their advantage (satellite operators have less flexibility, due to their technology.) As cable operators move to switched digital video and IP transport, they will have the ability to send any channel (or set of channels) to any subscriber. (IPTV operators like AT&T and Verizon can do this today.) These operators could offer to make small networks available on an a la carte basis in return for revenue sharing (for example, 70% of the fee would go to the cable operator and 30% to the network.) There would be some upfront costs to the network for making their content available to subscribers.
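The 70/30 split described above is easy to put in concrete terms; the $1.99 monthly channel price below is a hypothetical figure for illustration, not from any operator:

```python
# Illustration of the 70/30 a la carte revenue split described above.
# The $1.99 monthly channel price is a hypothetical assumption.
monthly_price = 1.99
operator_share = 0.70 * monthly_price   # cable or IPTV operator's cut
network_share = 0.30 * monthly_price    # small network's cut
print(f"operator ${operator_share:.2f}, network ${network_share:.2f}")
# operator $1.39, network $0.60 per subscriber per month
```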

The "top 50" essential networks wouldn't be subject to a la carte, so those network operators would be unlikely to fight the move with the FCC or U.S. Congress. Some customers might actually save money by buying a few channels a la carte.  The cable and IPTV operators could position the move as sensitivity to consumer demand. They would also have more latitude to remove unpopular channels, giving them more flexibility to offer HD and 3D channels, and more bandwidth for high-speed Internet services.
