Thoughts of a multimedia madman

Tuesday, April 25, 2006

TVonics PRISM

I recently noticed an interesting digital TV receiver produced by TVonics, a company local to me here in South Wales. They've produced a small box called the TVO-STB 111 PRISM which looks not unlike a metallic Toblerone. For the fashion conscious it comes in a range of colours and effects, including the ever popular pink (why does this colour sell?). The box has been available from Dixons stores (Currys in particular, I believe) under the Ferguson brand for about £70, but TVonics sell them direct via their own Amazon zShop in the UK for as little as £35.99, which is pretty competitively priced, especially considering that it is manufactured by Sony.

I've read some good reviews of it: the teletext services are supposed to be quite fast and the picture quality appears to be highly rated. I don't have one of these boxes so I can't vouch for it myself, but seeing as this box is British designed and built I'd recommend people consider it over the alternatives. After all, if we don't support British technology companies we can't complain when we get boxes that aren't optimally designed for the UK TV market. I have difficult digital reception conditions in my area, so I'd be interested in finding out whether these boxes can offer better reception than the Vestel based boxes I've been using.

Thursday, April 20, 2006

Search engine update

Well it would appear that Google and other search engines are now finally indexing my site correctly. It's amazing what a bit of bad HTML can do to the ability of automated crawling services to read your feeds and index your site. From what I can make out, Blogger automatically produces some bad XHTML, and the compose mode would only seem to make the situation worse in some cases.

Tuesday, April 18, 2006

Search engine woes

I don't normally question the effectiveness of search engines, especially not Google, but since starting this blog several months ago I haven't failed to notice that my blog is practically invisible to Google in terms of link references and content indexing. I find this very odd, as my rather outdated website www.ssonic.co.uk has been well indexed by practically all search engines and is easily found, but my blog was completely invisible at first and barely visible even after a promotional blitz. Initially I thought it was just a lack of interest, but as I gained recognition from other search engines like Yahoo and MSN Search I noticed that Google was still ignoring me, and I started to wonder why. At first I thought maybe Google was doing a bad job compared to the other search engines trying to play catch up with its success. I even tried submitting my blog as a site and adding a sitemap, but all that failed, although Google BlogSearch can find me, if only because it reads the Atom feed. It seems strange: why would a Google owned blogging service hosting a blog on a Google owned server not be indexed by Google? I tried one last idea and ran my website through the W3C validator, which generated 49 points in its report. From the report it would appear my Blogger account is generating crappy HTML/CSS, which in turn makes it difficult for a browser or search engine crawler to parse. This is rather silly, seeing as all this code was generated and provided directly by Blogger itself.

Of course this theory for my lack of search engine indexing has not been validated yet, but seeing that no search engine is indexing my content I will be very interested to see whether this blog gets indexed properly once I correct all or most of these mistakes. I'll update the blog if I have success, and I'd warn any casual blogger that it might be worth checking how valid the HTML their site generates is.

See my current validation status here: http://validator.w3.org/check?uri=http%3A%2F%2Fmmediaman.blogspot.com
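As a rough illustration of the kind of well-formedness problem that can trip up a crawler, here's a minimal sketch in Python (my own illustration, nothing to do with Blogger or the W3C validator itself) that flags end tags arriving out of order and start tags that are never closed, two of the most common faults in hand-edited templates:

```python
from html.parser import HTMLParser

# Elements that legitimately have no closing tag in HTML.
VOID_ELEMENTS = {"br", "img", "hr", "meta", "link", "input",
                 "area", "base", "col", "param"}

class TagChecker(HTMLParser):
    """Naive well-formedness check: report end tags that arrive out of
    order, and start tags that are never closed at all. Real validators
    do far more, but mismatched tags alone can confuse a parser."""

    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags
        self.problems = []   # human-readable findings

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_ELEMENTS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

    def report(self):
        # Anything still on the stack was opened but never closed.
        return self.problems + [f"<{t}> never closed" for t in self.stack]

checker = TagChecker()
checker.feed("<div><p>Hello<p>World</div>")  # unclosed <p> tags
issues = checker.report()
```

A real validator like the W3C's checks far more (attributes, nesting rules, doctype conformance), but even a crude pass like this catches the sort of breakage that makes a feed or page hard to parse.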

Thursday, April 13, 2006

Mobile operating systems are not commodities

An interesting article has appeared on the Symbian website refuting the often stated claim that mobile phone operating systems are a commodity and that it doesn't matter which OS they run. Linux phones come under fire for not being all the things people make them out to be; it's easy to forget that Linux is only a kernel, and it's not particularly mobile focused. Mobile phones and other portable devices have to be optimised for low power usage in order to create viable products, but power usage isn't really an issue on a mains powered computer, and even laptops still have fairly large, powerful batteries which you simply don't find in many portable devices. Because Linux is primarily a personal computer based kernel that caters more for performance needs, it is less likely to incorporate changes that reduce power consumption at the price of performance.

The other significant point of the article is that the kernel is only 10% of the OS, the rest of which is made up of a very large stack of applications, middleware and other infrastructure. In the case of Linux phones much of that might be proprietary and not open source in any way; it also shows that the core of the OS is only a small part of the picture. All in all it's a well thought out article that, despite coming from Symbian itself, lays out a well justified argument that the OS is very relevant in the mobile arena.

http://www.symbian.com/symbianos/insight/insight7.html

Wednesday, April 12, 2006

Could Haiku offer BeOS a future?

When Be closed its doors back in 2001 I thought the future for a promising operating system (OS) looked very bleak; without its creator and the main driving force behind it, the OS would surely die within a year or two. It didn't take long for the BeOS community to announce some Open Source efforts to recreate the OS from scratch, or using bits of other OSes like Linux, and though these seemed the logical step forward to save the OS from certain death, it seemed a tall order to recreate 10 years of OS development. Many promising OSes have come and gone over the years; what would stop a BeOS clone suffering the same fate of grinding to a halt at some point?

The first problem I saw was fragmentation: at least three different projects appeared with the promise of recreating BeOS, but OpenBeOS was the most prominent, and it was later renamed Haiku for legal reasons. While other projects never seemed to make it off the ground, Haiku did start to gain some traction by offering drop in replacements for various parts of the existing free release of BeOS R5. This seemed a smart move; recreating an OS from scratch is going to take a long time, just as it took a long time for BeOS to build up traction originally. Fortunately Be had released a free version of BeOS before the company collapsed, and this allowed the OS to live on legally, if only in a frozen state of development. Because the original OS was stable and mature, application and driver development could continue for those who were prepared to stand by their favourite OS, though almost all major commercial application development under BeOS was terminated fairly soon after Be closed. The BeOS community was also given a boost by the appearance of a company called YellowTAB, who developed a commercial version of BeOS called Zeta, picking up from where the original BeOS left off just before the R6 release. This helped maintain interest, although some BeOS fans were sceptical about the legitimacy and viability of Zeta.

The fact that the BeOS community has managed to stay alive and healthy in the 5 years since the original OS ceased development is its biggest asset, because it has kept applications alive and driver development reasonably in line with current hardware. Other than Linux, I don't think any other free OS has this kind of support behind it. I've been following the developments of Haiku on the Weekly Haiku blog for quite a while, and it seems the development of Haiku has reached a stage where the lower level efforts in recreating BeOS are starting to bear fruit. While Haiku is nowhere near complete or bug free, I get the impression that some stability is starting to show through, and some fairly big applications are starting to run, e.g. Quake 3 and VLC. This is excellent progress, and hopefully the developers will stay committed to achieving their goal of recreating BeOS to where the original closed source R5 is now. The developers have taken the wise decision of not releasing anything half-finished to the public, keeping progress and development fairly low key. The reason for this is to limit publicity until the software is ready; otherwise a lot of people would download a half finished OS that may not work well and be left with a lasting bad impression of it. In many ways Haiku operates much like a commercial organisation would, by keeping things relatively contained, keeping the team focused on the important tasks and staying quiet on the media front until they have something really big to announce.

When Haiku does reach its goal of releasing version 1.0, its software will fully replicate BeOS R5, providing a fully Open Source, easy to use, desktop focused operating system. It will be mostly suited to home users and small offices where complicated IT systems aren't used, since BeOS is largely single user and not designed for mass deployment in a managed environment. If Haiku can gain good driver support and a set of good quality free applications, then I can see it displacing Linux as the default OS on ultra-budget PCs which reduce costs by not bundling Microsoft Windows. I can also see power users adopting it as an alternative to Windows without the headache of managing Linux. It may also appeal to less computer literate people who want computers simplified and do not care for fancy graphical interfaces like those promised by Windows Vista.

When Haiku does reach version 1.0 this will only be the beginning of a BeOS revival; much has changed since 2001, and there was still plenty of room for improvement back then. There are many advances that could be implemented to make Haiku a more competitive OS. Here are a few I've thought of:

Automated updating of the OS, drivers and other major components could be a key feature to reduce manual maintenance. Windows Update is a good example of an update system that works well.

Integrating the OS with community websites and resources such as the BeBits application repository could help make the process of getting help and applications much easier than leaving the user to seek these out themselves.

Ensuring there are plenty of no risk methods to evaluate the OS before jumping right in. Initiatives such as live CDs, virtual machine distributions and easy dual boot installations immediately come to mind. Virtual machines could help encourage users to try the OS from within Windows without the risk of hardware incompatibility, while the OS builds up its native driver set to accommodate a larger variety of hardware.

Ensuring that the online experience is as close to Windows and the Mac as possible; that involves having a state of the art web browser (Firefox is already the browser of choice on BeOS), online media playback, Flash, Java, etc.

Integrating support for cross platform development systems such as Java and .NET to help bring more applications to the platform.

Supplying some well featured but simple general applications for common tasks such as word processing, e-mail and video editing with Haiku, so users can be productive straight after installation.

Whatever happens once the milestone is reached, it will be entirely up to the development team to decide where to go next, and this is where Haiku will really make its mark. Its success will depend on how radical the changes are. BeOS offers an excellent foundation to build on, but the OS will require plenty of changes, and there will always be the question of how far you can go without breaking too much, and how far you go to maintain compatibility. Will the developers want to keep the changes in line with what Be would have done or talked about doing, or will they try their own things? The community have been thinking about the future beyond version 1.0, which they call GlassElevator, with much discussion of what should be in it. Hopefully it won't be too long before the community get a chance to implement it, and we can find out if Haiku is the Open Source desktop OS everyone has been waiting for.

Monday, April 10, 2006

ASP.NET web wonder or web hell?

Recently I've been learning my third web development environment, ASP.NET 2.0, having had prior experience coding in ColdFusion and then PHP. I knew ASP.NET would be different because I had heard it was more object orientated, like traditional Windows application development, and that Microsoft had developed quite a sophisticated development environment to go with it. I've always been a bit wary of classic ASP with its seemingly unpleasant merging of VB scripting and HTML, without any rigorous separation of code and markup. In my experience I found ColdFusion very quick to get results with and easy to code for, without any more assistance than the provided manuals most of the time.


When I started learning ASP.NET properly for the first time, I was quite surprised at how quickly I could put together many of the basics from within the WYSIWYG interface. Master pages let you establish a consistent look very easily and very powerfully, and a whole login architecture has been developed so you can literally drag and drop to get started. Wizards were a breeze to create compared to the custom system I spent weeks perfecting in ColdFusion. The honeymoon period didn't last long, however, and once I started the serious task of putting together a web site in accordance with the specification I had, I soon found that the very powerful yet simple controls ASP.NET gives you could also prove to be its biggest problem. The login system was great if it didn't require modification to suit your needs, but almost all sites I've seen specify their own user management systems, and the site I was working on was no exception. To keep things as simple and integrated with ASP.NET as possible, I found I needed to write my own security providers, which appeared to be a daunting task. Though it didn't prove to be as difficult as I thought it might be, I was left unable to make it fit in completely, so I've had to mix my own custom pages together with the built in security mechanisms, which feels a little bit uncomfortable and risky to me.


While I managed to get each small task done, eventually it became very apparent that everything had to be done the Microsoft way, or you would find yourself in a situation where all the power you were given would slowly be sucked away until you were left without much of the added power at all. For instance, if you don't like Microsoft's security module and choose to completely bypass it without writing a security provider at all, then you can't take advantage of things like automatically protecting particular directories (e.g. a directory of admin scripts) from certain users or types of user; instead you would have to write access checking and denying code into every single script. I started finding I was spending a lot of time reading up on specific controls and sitting through basic examples of them in action, but when I tried using some of them I found that they either didn't do what I wanted, looked awful or operated in an undesirable way. I also noticed, when I started using CSS to style some of what I had written, that ASP.NET automatically produces a lot of HTML for its controls in the form of tables and HTML 4 style formatting, which is rather old fashioned and inflexible in a modern XHTML and CSS world. Microsoft say you can change the outputted HTML, but it involves what appears to be a rather non-trivial exercise, which quickly detracts from the simplicity that made ASP.NET appealing in the first place.
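To show what that declarative directory protection buys you, here's a minimal, language-neutral sketch (in Python rather than ASP.NET, with made-up paths and roles purely for illustration): the idea is to declare URL-prefix-to-role rules once, rather than repeating access checks in every script.

```python
# Hypothetical rule table: URL prefix -> roles allowed to access it.
# None means the area is public. Most specific prefixes come first,
# since rules are checked in order.
ACCESS_RULES = [
    ("/admin/", {"admin"}),              # admin-only scripts
    ("/members/", {"admin", "member"}),  # logged-in area
    ("/", None),                         # everything else is public
]

def is_allowed(path, user_roles):
    """Return True if a user holding user_roles may request path.
    One central check replaces per-script access code."""
    for prefix, allowed in ACCESS_RULES:
        if path.startswith(prefix):
            return allowed is None or bool(user_roles & allowed)
    return False
```

The point of the pattern is that adding a protected directory means adding one rule here, not editing every script in it; that is roughly the convenience you give up if you bypass the built in security module entirely.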


I got the impression with ASP.NET that a lot of small example projects were written in it during development, which probably worked quite well, but I'm not sure it was thought through for larger sites with more specific requirements. A real web professional would probably be better off sticking to a traditional language like PHP or ColdFusion where the control is absolute (assuming they haven't changed beyond recognition since I last used them); however, I feel ASP.NET would be an excellent way for a hobbyist programmer to create a small custom website quickly and easily.


I could be wrong about this and may yet see the light; until then, I feel that anyone who is thinking of using ASP.NET should think long and hard about what impact its development model will have on their project.

Friday, April 07, 2006

More on the digital radio music download trial

The BBC have now published a news story on the DAB music download trial announced yesterday, and it offers slightly more information. The service will deliver music to your phone and send a copy to your home PC to put onto your iPod (though I suspect they mean other portable players too). There is no mention of what music format and protection scheme will be used, but my guess is this might all be based on Windows Media, as that is what was used for the 5.1 surround sound DAB trials, so it is already an established format for DAB broadcasting.

I hope to do some investigations into the DAB IP services later this year to see how they work and what impact they have on the DAB platform.

BBC News story:

http://news.bbc.co.uk/1/hi/entertainment/4882926.stm

Thursday, April 06, 2006

Trials of downloading music via DAB radio announced

The ability to download music tracks via DAB digital radio could be coming to a DAB multiplex near you this summer (if you live in the UK, that is). Primarily designed for mobile phones, this new facility will let you download music using the spare data channel bandwidth on what I assume will be the MXR multiplexes. Heart will be the station trialling the service, and I assume the downloads will relate to music that is currently playing or is on the station's playlist.

This all appears to be an expansion of the BT Movio platform, which this year concluded trials of TV over the DAB network using a system called DAB IP. I think this sort of thing is what digital radio is all about in terms of changing what radio can offer listeners; however, the radio industry must be careful not to reduce the already dangerously low bitrates used for digital radio via DAB in the UK to accommodate these services. If this trend of bitrate reduction continues, then fewer people will choose to listen to digital radio via DAB because of the terrible sound quality.

There is no mention of how the music downloads are protected, as potentially anyone with a PC based DAB receiver (quite a rare thing now) could extract these downloads for personal consumption. There is also no mention of whether actual DAB radios, as well as mobiles, would be allowed to take advantage of this service. In the future, if DAB is incorporated into MP3 players, we could potentially have the ability to add music to our personal collection as we listen to it via the radio, which might help make radio more appealing to the iPod generation.

Radio Today article: http://www.radiotoday.co.uk/modules.php?op=modload&name=News&file=article&sid=664

Thanks to Mike Barraclough of the Yahoo! DAB group for bringing this article to my attention.

Wow, a useful online government service

You might not believe it, but the UK government can sometimes put some really useful information online, at least for software developers. I recently needed to validate a postcode and possibly some other information, and a search on the web brought me to the UK Government Data Standards Catalogue, a part of the Cabinet Office website which contains a listing of many common data types used in the UK, along with UML diagrams, XML schemas, and validation and verification information contributed by various government departments. Some of these items even have regular expressions, contained within their XML schemas, that you can use to validate the data. This information can be extremely useful in helping to ensure there are fewer errors in data entry, which I have noticed can be a frequent problem, as I found when validating postcodes. When entering postcodes, people will often put in an 'O' where there should be a zero and an 'I' where there should be a one. It pays to have good data, and it's much easier to fix mistakes earlier in the process rather than later, when it might prove much more difficult to resolve the problems. At least my taxes have gone on something useful, unlike some of the horribly inefficient construction work I've seen in Swansea's SA1 area.
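As an example of putting this to use, here's a rough Python sketch of the approach I'm describing. The pattern below is my own simplification, not the exact regular expression published in the catalogue: it normalises a typed postcode, fixes the O-for-zero and I-for-one slips mentioned above, and then checks the general shape.

```python
import re

# Simplified UK postcode shape (outward code, space, inward code).
# The Data Standards Catalogue publishes a stricter official pattern;
# treat this as an approximation for illustration.
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}[0-9][0-9A-Z]? [0-9][A-Z]{2}$")

def clean_postcode(raw):
    """Normalise user input and fix the common O-for-0 and I-for-1
    slips in the inward code (the final digit-letter-letter group)."""
    pc = raw.strip().upper().replace(" ", "")
    outward, inward = pc[:-3], pc[-3:]
    # The first character of the inward code must always be a digit.
    first = inward[0].replace("O", "0").replace("I", "1")
    return outward + " " + first + inward[1:]

def is_valid_postcode(raw):
    """Validate after cleaning, so near-miss typos are rescued."""
    return bool(POSTCODE_RE.match(clean_postcode(raw)))
```

Cleaning before validating is the key design choice here: a raw pattern match would reject "SA1 OAA" outright, whereas fixing the obvious slip first turns a data entry error into good data at the point of capture.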

Wednesday, April 05, 2006

Predicting the future

Technology predictions are a funny thing, particularly when people start predicting more than, say, 5 to 10 years into the future. This crystal ball gazing can often be very misguided, either too advanced or not going far enough. Robots are a classic example of something that has never had very accurate predictions, because some people don't realise the technological complexities involved in making anything close to a replacement for human functions. The UK's telecoms giant BT has futurologists who try to predict how technology will change things, and Digital Lifestyles have published an article about some of their latest predictions here.

Reading through the article, I'm not convinced that many of the predictions will happen for a long, long time, if at all. The most significant statement I can see is the recognition of the need to rein back our energy usage because of the coming shortages of fossil fuels like oil, which we still need for our plastics.

I'm all for making predictions myself; I love thinking up new ideas and imagining how new technology might change our lives, but I realise this has to be grounded firmly in reality, particularly if you are trying to think of real products to develop. It's all too easy to see someone preach about a utopian future where technology solves all of mankind's problems and makes things better for everyone. Often these predictions sound very clinical and almost never mention the problems they bring. No one told us at the start of the Internet revolution that for all the good things the Internet brings, it would also bring spam, viruses, scams, fraud, etc. Everything new will bring something good and something bad, and you need to consider all new technology in balance: the telephone was a communications revolution, but we have to put up with telesales calls as a result. Another point to consider is just how much people hold on to the past; just because a technology exists doesn't mean anyone will use it until they are either forced or compelled to take it up, and this gets harder with age. With an aging population we could actually see the rate of technology adoption slow down if we can't convince people of the benefits or make technology easier to pick up.

Take another classic misprediction, the fabled videophone. Videophones have been around since the 1960s, and since then we've been told numerous times that they are the future, but they never came close to taking off, for many reasons. Even now that videophones have finally gained the mass adoption they never had, thanks to the advent of 3G phones and networks, most people rarely if ever use the video calling aspect. Why? My guess would be that it is often unnecessary and uncomfortable to have to talk face to face with another person via the phone, especially when you might be performing another task, your expressions and appearance are being judged, or you simply feel uncomfortable being watched. Videophones do have their uses and they may yet become ubiquitous, but just because you can have a video conversation, it doesn't mean you will choose to.

Monday, April 03, 2006

The Curse of Wireless Networking

The advent of wireless networking is supposed to have brought about a revolution in home networking, finally doing away with the need for all that specialised cabling trailing around the house and the need for hubs, switches, etc. It's supposed to free your computer to live wherever you want it to and let you surf the web on your laptop in the garden.


The reality, in my view, is that we still have a long way to go, and I'm not sure wireless is going to be a long term solution for home users. I have owned an 802.11g wireless router since first having my own broadband connection, and while the wired side of it has served me well, I have been less than impressed with the wireless service.


The challenge with wireless is getting a good signal and maintaining a continuous connection. To do this you need to ensure you are well within the wireless router's range; the further you get from it, the weaker the signal, but unless you live in a tiny apartment you will probably have your computing devices spread throughout your home. The signal issues really become obvious if you subscribe to the concept of the digital lifestyle like myself: while web surfing doesn't demand much from your wireless connection, the streaming of media content certainly does.


My Belkin router, for whatever reason, doesn't like maintaining a connection for more than about 30 minutes without a connection blip, no matter how good the signal strength. These blips, which don't have an obvious cause, became very apparent to me when I first attempted to watch a movie wirelessly by connecting my main computer upstairs to a laptop connected to the TV downstairs in a small house. The blips led to the movie player getting suitably confused, requiring a restart and a repositioning of the film's progress twice during the film. That film was a low bitrate one; when I tried a similar attempt at streaming digital TV, which is far more demanding, I wasn't successful at all. When I later moved house to a location where my main computer was forced to rely on a wireless connection from the room upstairs, I soon discovered even the most basic tasks of web browsing and instant messaging were easily upset by connection blips.


Another big problem I noticed is the security aspect. If you want easy configuration you need to leave your wireless connection open, which then leaves it open for your neighbours or passers by to enjoy the fruits of your network and Internet connection. If you enable the security system, then you are often left having to type in a long set of hexadecimal numbers, twice, every time you want to configure a computer to use the connection; not exactly a user friendly process. While some devices allow you to use a pass phrase to make the process easier, it's not all that common an option, so you normally have to resort to the hex numbers anyway. There is a high chance the average person will not be aware of, or knowledgeable enough, to set up the security, without which it is similar to leaving your front door unlocked: you are leaving yourself open to others accessing your personal files or abusing your Internet connection.


There is an alternative that hopes to marry the convenience of wireless with the consistency of wired connections, and that is to use your home's power cabling. These devices are still in their infancy, but within the next two years I imagine they will be widely available at reasonable prices. The concept is simple: almost every device that needs a network is going to be plugged into a wall socket, and all those wires meet up somewhere, so you've already got a medium that can potentially carry the data. All that's needed are some devices to transmit and receive the data over your power cables, and these are now on the market at less than £100 a pair. OK, so it's a bit pricey, the units are a bit bulky, and with typical speeds of around 14Mbps they aren't even a quarter of the speed of your typical 100Mbps wired network, but these are still early days. Within the last year the HomePlug AV standard was approved, which promises speeds of up to 200Mbps; that could easily serve the networking bandwidth of the typical home for several years yet. It may also lead to your various devices around the home being able to talk to each other, e.g. your alarm clock could turn on your TV, and your AV equipment could carry data between devices without the need for all those messy wires. Over time the bulky adapters will hopefully get integrated directly into computing devices and appliances to make setup easier and look more aesthetically appealing.


Power line networking won't replace wireless, as you will still need wireless networking for laptops, PDAs, mobile handsets and other networked battery operated devices; however, it may be that wireless becomes a complementary form of networking. In built up areas there are big problems with wireless networks reaching saturation levels, where there are so many other wireless networks around that there isn't room for any more, and there are big problems with interference too. Many people also complain that while wireless networking can work through walls and floors, it isn't always very good at it, so power line technology appears to be a promising way forward.