Thursday, December 31, 2009
There are forces at work that are going to completely change the television business in the US. On one side are the major television networks, who believe it is their right to earn large sums of money from television simply because they always have. On the other side is the consumer, who is tired of the cost and increasingly switching off. In the middle are the cable companies and the cable content companies.
The consumers are fed up. Television has become unwatchable as the number and length of the commercial breaks have grown. We used to get 48 to 50 minutes of content in each hour, and now we get just 42 minutes. At that rate, a season of 24 contains less than 17 hours of content (24 episodes at 42 minutes each is about 16.8 hours). The only way to watch a TV show is to record it on a DVR and watch it later, skipping the commercials. Once we get in the habit of watching TV offline, it becomes much easier to cut the cable completely and just watch the web. Between Netflix, Hulu and YouTube there is quite enough to keep us entertained.
Another source of complaint is the constantly rising cost of cable, driven by the cable companies paying more and more for content. For example, the cable companies pay ESPN about $4 per month per subscriber to carry the channel, and that fee is rising. Other cable content companies are jumping into the valuable content pool. Ten years ago the AMC channel showed very old movies with no commercial breaks; now AMC puts on award-winning shows like Mad Men, full of commercials. Every cable channel seems to have its must-see TV program, from the BBC with Top Gear to the USA network with Burn Notice.
The cost of cable is about to go up sharply as the major TV networks demand commensurate fees from the cable companies for their programming. This does not seem like a winning idea in recessionary times. As fees rise, more and more people will cut the cable. Either the cost of cable has to stabilize, with cuts to content, or TV risks going the way of radio. (I hear that radio still broadcasts, but I do not listen to it, and nobody I know still listens.) I think that we will see some big changes coming to the TV business over the next year or so.
Sunday, December 27, 2009
Kindle Chronicles
Amazon announced that "On Christmas Day, for the first time ever, customers purchased more Kindle books than physical books." Well duh! If you want a physical book for Christmas, you have to buy it before Christmas Day. On the other hand, everyone who received a Kindle as a gift used the wireless book download feature to get a book to read on Christmas Day. In the very same announcement, Amazon said that "Kindle has become the most gifted item in Amazon's history". Amazon's statement is a nice piece of spin but not a lot more.
More interesting commentary on electronic book readers can be found in the Kindle Chronicles blog. In the early days of digital music, musicians generally stood by their record companies. Book authors seem to be a much more independent lot, according to the most recent post, "What We Have Here Is a Failure To Communicate". The publishers have been trying to preserve their position by keeping the prices of ebooks high, while the authors want to be read, and the books that sell best on the Kindle are the cheaper ones. Also, authors do not see why the publishers should get such a large share of the revenue when ebook inventory costs them nothing.
Saturday, December 26, 2009
Die AutoRun Die
Another year has almost passed and I have not yet ranted about an awful, unnecessary and totally annoying feature of Microsoft Windows, so today I am going to tell you why AutoRun should die.
AutoRun is the "feature" where you plug something into your computer and then stuff happens completely out of your control. The thing you plug in might be a key drive, a camera, an iPod or whatever. Last Christmas I won a 4GB SanDisk Cruzer USB key drive as a door prize. When I plugged this horrible little thing into my computer, it installed the U3 driver with useless and dangerous functions that I DO NOT WANT! To make matters worse, there is no obvious way to remove the driver or its annoying functionality. To top off the bad behavior, even though I immediately erased the entire contents of the drive, when it was plugged into another computer, it infected that computer with its unwanted drivers as well. I have thrown the key drive away to prevent further damage.
The combination of USB key drives and AutoRun is such a serious computer virus infection vector that key drives are being banned in critical places. However, the problem is not just with key drives. I have not disabled AutoRun because I use it two or three times a week to sync my iPod with the latest podcasts. Recently my daughter plugged her iPod into my computer just to recharge the battery. First this caused iTunes to crash; then, when I brought it back up, it wanted to sync my stuff onto her iPod. My daughter does not want anything of mine on her iPod, and I had to jump through hoops to prevent the sync.
The problem is that iTunes and everyone else have totally bought into the automagic nonsense of AutoRun behavior. A much simpler, safer and easier-to-use behavior would be to have the user plug in a device and then bring up a program to use the device. Unfortunately the designers(?) of Windows decided to emasculate their users and instead give the device the power to decide what it wants to do. The subliminal message from Microsoft is that you are too stupid to operate your own computer, so we are going to do it for you, or let anyone else who might have more of a clue do it for you. The consequence of this design is that our computers do not belong to us, but to the hackers who exploit these "features" as attack vectors to take control of them.
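As a rough illustration of that user-driven alternative, here is a minimal Python sketch of my own (not anything Windows or iTunes actually ships) that polls for newly attached drive letters and asks the user before touching the device:

    # Hypothetical sketch of user-driven device handling on Windows:
    # nothing runs automatically; the user decides what happens next.
    import os
    import string
    import time

    def attached_drives():
        """Return the set of drive letters currently present."""
        return {d for d in string.ascii_uppercase if os.path.exists(d + ":\\")}

    known = attached_drives()
    while True:
        time.sleep(2)                        # poll every couple of seconds
        current = attached_drives()
        for letter in sorted(current - known):
            answer = input("New drive %s: found. Open it? [y/N] " % letter)
            if answer.lower().startswith("y"):
                print("Contents of %s:\\ ->" % letter, os.listdir(letter + ":\\"))
        known = current

The point of the sketch is simply that the computer's owner, not the device, initiates whatever happens next.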
If you sit back and think about it, AutoRun is obviously ill-conceived. The design center is a single user logged into their computer and actively using it. What does AutoRun do when nobody is logged into the computer, and what does it do when two users are logged in? In the example that I gave above, my daughter plugged her iPod into my computer when two people were logged in and the screen saver had locked both accounts. Of course iTunes crashed; it did not know what to do.
The iPod and iTunes are particularly annoying because they are unusable without AutoRun. On the iTunes support web site, the top support issue is "iPod doesn't appear in iTunes" and the second issue is "iPhone does not appear in iTunes". However there is no button in iTunes to go and look for an iPod or iPhone; instead they rely on AutoRun with no easy fallback should that fail.
Sunday, December 20, 2009
BI Megatrends: Directions for Business Intelligence in 2010
Every year David Stodder, Research Fellow with Ventana Research and editor-at-large with Intelligent Enterprise, writes a column on Business Intelligence Megatrends for the coming year. The column looks back at what has happened over the last year and forward to what he expects to happen in the next. This year David also presented his thoughts to the December meeting of the SDForum Business Intelligence SIG. David talked about many topics; here I will just cover what he said about the big players.
Two years ago there was a huge wave of consolidation in Business Intelligence, when the major independent BI vendors were bought up by IBM, SAP and Oracle, who along with Microsoft are the major enterprise software vendors. In the last year SAP has integrated Business Objects with its own software to the point that it is now ready to threaten Oracle.
Consolidation has not finished. In 2009, two important mergers were announced. First, IBM bought SPSS to round out its analytics capabilities. This move threatens SAS, which is in the same market; however, SAS is a larger and more successful company than SPSS, and as a private company it does not necessarily need to respond to the pressure to consolidate.
The other merger is Oracle's offer to buy Sun, and the effect that has on Oracle's relationship with HP. HP and Sun are bitter rivals in enterprise hardware, and HP was the launch partner for Oracle Exadata, the high-end Oracle database. Now Oracle is pushing Sun hardware with Exadata, leaving HP in the lurch. David pointed out that there are plenty of up-and-coming companies with scalable database systems for HP to buy. That list includes Aster Data Systems, Greenplum, Infobright, ParAccel and Vertica. Expect to see something happen in this area in 2010.
Of the three major database vendors, Microsoft has the weakest BI offering, despite SQL Server 2008. However, Microsoft does have the advantage of the Excel spreadsheet, which remains the most widely used BI reporting tool. A new version of Excel is due in 2010. Also, Microsoft is making a determined push in the direction of collaboration tools with SharePoint. As we heard at the BI SIG November meeting, collaboration is an important new direction for enterprise software.
Thursday, December 17, 2009
A Systematic and Platform Independent Approach to Time and Synchronization
Managing time and synchronization in any software is complicated. Leon Starr, a leading proponent of building executable models in UML, talked about the issues of modeling time and synchronization at the December meeting of the SDForum SAM SIG. Leon has spoken to the SAM SIG previously on executable models. This time he brought along two partners to demonstrate how the modeling technique can be applied to a broad range of problems.
Leon started the meeting by talking through five rules for handling time and synchronization. The first and most important rule is that there is no global clock. This models real systems, which may consist of many independent entities, and allows for the most flexible implementation of the model on a distributed system. In practice, the other rules are largely consequences of this first rule.
The next rule is that the duration of a step is unknown. The rule does not imply that a step can take forever; its purpose is to say that you cannot make assumptions about how long a step may take. In particular, you cannot expect independent steps in the model to somehow interleave themselves in some magical way. The third rule is that busy objects are never interrupted. This forces the modeller to create a responsive system by building it from many small steps, so that an object is always available to handle whatever conditions it needs to handle.
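As a small illustration of that "many small steps" consequence, here is a sketch of my own (not Leon's notation or any Executable UML tool) of an object whose behavior is broken into short event handlers that each run to completion, so the object is never stuck in the middle of a long step when the next signal arrives:

    # Sketch of a run-to-completion state machine: each handler is a short,
    # finite step, so the object is always ready for the next event.
    class Door:
        # (state, event) -> (next state, action)
        transitions = {
            ("CLOSED",  "open_requested"):  ("OPENING", "start motor"),
            ("OPENING", "fully_open"):      ("OPEN",    "stop motor"),
            ("OPEN",    "close_requested"): ("CLOSING", "start motor"),
            ("CLOSING", "fully_closed"):    ("CLOSED",  "stop motor"),
        }

        def __init__(self):
            self.state = "CLOSED"

        def handle(self, event):
            next_state, action = self.transitions.get(
                (self.state, event), (self.state, "ignore"))
            print(self.state, "--", event, "->", next_state, "(" + action + ")")
            self.state = next_state

    door = Door()
    for e in ["open_requested", "fully_open", "close_requested", "fully_closed"]:
        door.handle(e)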
The fourth rule is that signals are never lost. This is an interesting rule as it gets to an issue at the heart of building asynchronous systems. The rule implies that there is a handshake between sender and receiver. If the receiver is not ready, the sender may be held up waiting to deliver the signal. Perhaps the signal can be queued, but then there is the problem that the queue may not be big enough to hold all the queued signals. In the end you have to build a system that can naturally handle all the events thrown at it, if it is a safety-critical system, or that fails gracefully if it is not.
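A minimal sketch of that sender-receiver handshake, assuming a simple bounded queue (my own illustration, not how any particular Executable UML runtime implements the rule):

    # "Signals are never lost": a bounded queue between sender and receiver.
    # When the queue is full, put() blocks, so the sender is held up rather
    # than the signal being dropped.
    import queue
    import threading
    import time

    signals = queue.Queue(maxsize=4)    # deliberately small to show back-pressure

    def sender():
        for i in range(10):
            signals.put("signal-%d" % i)     # blocks while the queue is full
            print("sent signal-%d" % i)

    def receiver():
        for _ in range(10):
            sig = signals.get()
            time.sleep(0.1)                  # a slow receiver
            print("handled", sig)

    threading.Thread(target=sender).start()
    threading.Thread(target=receiver).start()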
The fifth rule is that there is no implicit order in the system, except that if one object sends signals to another object, the signals arrive in the order that they were sent. Note that I may have interpolated some of my own experience into this discussion of the rules. If you want to explore further, watch this video on YouTube and go to Leon's web site, which leads to many interesting papers and discussions.
Next at the meeting, Leland Starr, younger brother of Leon, talked about a web application that he had led for his employer, TD Ameritrade. The online application is for arranging participants in online webinars. By using the UML modelling technique, he created a model that could both be used to explain to the business sponsors of the project how the system would work and be executed to check that it worked as expected. Leland has a SourceForge project for his work.
Finally Andrew Mangogna talked about a very different class of applications. He builds software to control implanted medical devices like heart pacemakers. The two overriding concerns are that the medical device performs its function safely and that it runs for at least 5 years on a single battery charge. Compared to many of the applications that we hear about at the SAM SIG, the implantable device applications feel like a throwback to an earlier and simpler age of computing. The applications are written in the C programming language and the code typically occupies 3 to 4 kilobytes. The program data is statically allocated, and an application can use from 150 bytes to 500 bytes. Andrew also has a project on SourceForge for his work.
Labels:
Open Source,
SDForum,
software engineering
Friday, December 04, 2009
Bandwidth Hogging
There are several discussions going on around the web about bandwidth hogging, started by a post from Benoit Felten in the fiberevolution blog. I wrote about this issue last month in my post on net neutrality. The basic problem is that when the internet becomes congested, the person who has created the most connections wins. Congestion can happen anywhere, from your local head end through to a backbone and the backbone interconnects. Felten claims that there is no bandwidth hog problem and, given the data, he is willing to do the number crunching to prove it, while others disagree.
The problem is a classic Tragedy of the Commons. There is a shared resource, the internet, and some people use more of it than others. That is fine provided that they do not interfere with each other and there is enough resource to go around. As I explained, the problem is that when there are not enough resources to go around, the people who win are the people who create a large number of connections, and these tend to be the people who use the most bandwidth. The point of a torrent client creating a large number of connections is to ensure that the client gets its "share" of the net whether there is congestion or not. The only viable response is for everyone else to create large numbers of connections to do whatever they want to do, be it download a web page or make an internet phone call. This is undesirable because it can only lead to more congestion and less efficient use of the shared resource.
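To see why the connection count decides who wins, here is a back-of-the-envelope sketch with made-up numbers, assuming the congested link is shared roughly equally per TCP connection:

    # Rough model: a congested link is shared equally per connection, so a
    # user's share is proportional to how many connections they open.
    link_mbps = 100
    connections = {"torrent user": 40, "web user": 2, "phone call user": 1}

    total = sum(connections.values())
    for user, n in connections.items():
        share = link_mbps * n / total
        print("%s: %d connection(s) -> %.1f Mbps" % (user, n, share))
    # torrent user: 40 connection(s) -> 93.0 Mbps
    # web user: 2 connection(s) -> 4.7 Mbps
    # phone call user: 1 connection(s) -> 2.3 Mbps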
There are two parts to a solution. First, the internet service providers have to keep adding equipment to reduce congestion as internet usage grows; everything would be fine if there were no congestion. Second, we need better algorithms to manage congestion. Penalizing people for using the bandwidth they were sold is not the answer, particularly when that is not the real problem. I have suggested that we should look towards limiting connections. Another thought is to kill the connections of the users with the largest numbers of connections when congestion occurs, although I am sure that this too would have some unintended consequences.
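For what it is worth, here is a sketch of what a per-user connection cap might look like during congestion. This is a thought experiment of my own, not a description of how any ISP actually manages its network:

    # Congestion management by capping connections per user: when the link is
    # congested, users over the cap have their excess connections shed.
    PER_USER_CAP = 8

    def connections_to_drop(conn_counts, congested):
        """Return how many connections each user would lose under the cap."""
        if not congested:
            return {user: 0 for user in conn_counts}
        return {user: max(0, n - PER_USER_CAP) for user, n in conn_counts.items()}

    print(connections_to_drop({"torrent user": 40, "web user": 2}, congested=True))
    # {'torrent user': 32, 'web user': 0}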
The real problem is that unless we can all agree to be good internet citizens and get along, the forces against Net Neutrality may win. Then large companies with deeply vested interests will get to decide who has priority. The recently announced merger of Comcast, a large Internet Service Provider, and NBC, a large content provider, is exactly the sort of thing that we need to be wary of.