Well, the wild weather extremes over the last couple of months have certainly had me thinking.
For some time now, I've been working through in my mind how I expect to see NZ communications change over the next ten years, and what needs to be put into place by the service provider(s) to make that happen. While some of these points may seem amusing or absolute common sense to the lay observer, or geek, what needs to be recognised by the reader is that NOTHING is to be taken for granted, or that 'of course it will happen'.
Anyone who spends time writing business cases or convincing others around them of the importance of pursuing an action will be familiar with this. For example, to provide fibre for the purpose of faster networking seems common sense. After all, the technology is suited to it and has been proven in other countries. Of course the Government should provide it; after all, it's for the good of the country, right?
The sums involved are tiny in the great scheme of countries and time. Of course, for those who live in the here and now, that's money not available for anything else sensible or useful. Like upgrading basic power generation, transmission, distribution and storage.
Tonight, the power at home has been fluctuating wildly. My technology has been reset and spiked so many times I'm amazed nothing has shorted out. I am grateful the heating is still running - grateful that the bods at the electricity company are doing their best.
But what has become abundantly clear, once it has been taken away, is the absolute and unremitting reliance on the Internet. Should the weather prevail tomorrow, I may not be at work - and neither will the people I work with. Meetings disrupted, critical information unavailable or severely restricted. Sure, my POTS line continues to work, but as I don't suffer an emergency every other day it's redundant unless I pair it with the Remote Office functions of my VOIP line - after all, people call me at my work number, not at home.
And before the mobile zealots jump in - Vodafone performance where I live is utterly atrocious. I have lodged faults complaining of basic performance and coverage and been met with the immortal line of the disconnected: "coverage is the best we can make it, function of technology and topology, adverse terrain blah blah blah". Twaddle of course - had the techs ACTUALLY driven my route and surveyed it themselves, they would see that the network has lost its tuning and is performing poorly. The devices I use are highly capable of running applications independent of the mobile network. 2degrees, Vodafone, Telecom - welcome to the world where you really are replaceable but for the specials you provide on DEVICES.
Even if it were running well - well, I don't want a microwave next to my head anymore. I have no need for it. I dislike using my DECT gear, but I prefer its low power to the happiness that is a 3G mobile. Good for email and texting.
But where I work, it's the Internet. Always the Internet. The mobiles I use are internet connected - for email and applications. Using 3G is a painful, uninspiring chore. I'm pleased the widgets have gotten to the point of being so capable, and this is the first of my 2 points in this blog:
1. The device IS the experience
As we know, the selection on the market is astronomic. Which one is the right one to use?
Tonight I had to get Skype going, for the very real purpose of communicating face to face. I gave up on my PC gear and used the Mac - and it worked first time, very well, even with a low-grade camera. My wife, many of my colleagues and acquaintances - they ask my opinion of gear, as 'someone' who will know - and they know I have a view.
But as my wife points out, there is no way she would talk to the boys in Dick Smith, Noel Leeming, Harvey Norman etc - the atmosphere is intimidating, the techniques for selling visible and shameful, the actual help in many cases just utter rubbish. It reminds me of dodgy car salesmen at times.
Yet in the UFB world, DEVICES are KING. In the UFB world, who cares really about who provides the POTS line or Internet? The fibre, after all, comes from one company. The weakest link in today's industry (Telecom Copper) will be the only game in town soon enough (TeleChorus Fibre). After that, the differentiator is internet and call plans, and the quality of customer service/service reliability. What's left, except the widgets with which you use your UFB?
Which then leads to the 2nd and last point.
An interesting point I heard from a Chorus chappie after the Feb 22nd earthquake was their absolute drive to restore Mobile for basic comms only, and Internet primarily - because most people wanted to visit the Civil Defence website, and NOT disrupt the work the humans were doing. Internet was more important - and many people struggled with the lack of power, as we know, for a great many basic necessities, like heating and communication.
Fibre is connected to your property at a termination point, then connected to a device referred to as a NID (Network Interface Device), NTU (Network Terminating Unit), IAD (Integrated Access Device), ONT (Optical Network Terminator) or other such acronym. They all do roughly the same thing: turn the light signals on fibre into electrical signals for use by your equipment. This device needs power - not a lot, but it does need it. And it needs to come from you, unless somebody has managed to convert the light to useful energy... So there is link 1, needing a feed.
Link 2: highly likely you will have some unit from your service provider, which will supply you options of WIFI, voice, IPTV or similar content, Internet, whatever innovation emerges. Another feed point needing power.
As these 2 units provide critical services, they should get support power - both to protect against outage and provide respectable runtime - 3-7 hours in my view.
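As a sanity check on that 3-7 hour figure, here's a back-of-the-envelope battery sizing sketch in Python. The 12W combined draw, inverter efficiency and usable-capacity figures are my assumptions, not vendor specs:

```python
def backup_battery_wh(load_watts, runtime_hours,
                      inverter_efficiency=0.85, usable_fraction=0.5):
    """Watt-hours of battery needed to run a load for a given time.

    usable_fraction reflects that lead-acid batteries shouldn't be
    drained much past 50% depth of discharge.
    """
    return load_watts * runtime_hours / (inverter_efficiency * usable_fraction)

# An ONT plus a provider gateway might draw ~12W combined (assumed figure).
for hours in (3, 7):
    wh = backup_battery_wh(load_watts=12, runtime_hours=hours)
    print(f"{hours}h runtime -> ~{wh:.0f}Wh of battery")
```

Even at the 7-hour end, that's a modest battery - roughly alarm-panel backup territory, not a garage full of cells.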
And that's it.
Both replace the PSTN as the base network, with Internet as the foundation upon which all else is built. And that is where everyone must strive to succeed - securing the internet end to end, so that it all just works.
I would argue that this doesn't just equate to Silicon Valley. Who's willing to play "plug in NZ companies into the chart below"?
Original weblink: PWC on Innovation
A friend of mine reposted this article on his LinkedIn profile, which is on Innovation, and is something I found quite interesting. I'm posting it here, because it holds very true for the ICT industry in New Zealand, especially now that the UFB project is about to get going (more on my views in a coming post).
The main points:
"Demystifying Innovation: take down the barriers to new growth," the drive for innovation must arise from the CEO and other executive leadership by creating a culture that is open to new ideas and systematic in its approach to their development. The innovation process generally has four phases:
- Discovery: Identifying and sourcing ideas and problems that are the basis for future innovation. Sources may include employees as well as customers, suppliers, partners and other external organisations.
- Incubation: Refining, developing and testing good ideas to see if they are technically feasible and make business sense.
- Acceleration: Establishing pilot programs to test commercial feasibility.
- Scale: Integrating the innovation into the company; commercialisation and mass marketing.
The study also identifies seven misconceptions about the innovation process:
- Innovation can be delegated. Not so. The drive to innovate begins at the top. If the CEO doesn't protect and reward the process, it will fail.
- Middle Management is the ally of innovation. Managers are not natural champions of innovation. They tend to reject new ideas in favour of efficiency.
- Innovative people work for the money. Establishing a culture that embeds innovation in the organisation will attract and retain creative talent.
- Innovation is a lucky accident. Successful innovation most often results from a disciplined process that sorts through many ideas.
- The more open the innovation process, the less disciplined. Advances in collaborative tools, like social networking, are accelerating open innovation.
- Businesses know how much innovation they need. Leaders must calculate their potential for inorganic growth to determine their need to innovate.
- Innovation can't be measured. Leadership needs to identify its ROII--Return on Innovation Investment.
ICT is a capital intensive business; that means LOTS of cash spent by companies is classified a certain way and can be depreciated over future years, much like any other asset can be. If I spend $1m developing a new product, it means I can take a charge to the business accounts over the life of the product, rather than realise all costs up front. This might sound boring and a little dry, but it's a fundamental tenet of how investment works and the behaviour it drives in a company. Put another way, if you spend $1m buying a business, you expect returns over the life of the investment, like any other investment. The more the better.
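A minimal sketch of the capitalisation behaviour described above, using straight-line depreciation - the $1m cost and 5-year product life are purely illustrative, not from any real set of accounts:

```python
def straight_line_depreciation(cost, life_years):
    """Spread a capitalised cost evenly across its useful life,
    one annual charge per year."""
    return [cost / life_years] * life_years

# $1m of product development over a 5-year product life:
charges = straight_line_depreciation(1_000_000, 5)
print(charges)  # five annual charges of $200,000, not $1m up front
```

Same total cost, very different behaviour: the business books a steady charge each year and expects returns over the same horizon.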
The interesting conundrum though is the last point; Innovation can't be measured. At least, not with significant accuracy in advance of the investment. Any investment carries risk, which can only be reduced by understanding more about the nature of the investment as well as the people making the promise.
The significance of the last point is that Innovation involves Research & Development - words that drive cold sweat into investment folk. Simple statements like 'Online Ordering', 'It all just works', 'It shouldn't be this hard' - well, Simple is difficult to engineer and takes a lot of effort. Folks have marvelled at how easy the Apple iPhone is to use - but prior to this, the industry threw GOBS of Innovation money at the concept. Apple did it better - and I bet they went down a lot of dead ends and wasted efforts in the process.
That's a hard business case to write - 'The estimate is $4m, but about 15-30% of the project involves stuff we've never done before'.
UFB has been pitched as $1.35bn public money investment, matched by at least equal private sector investment. The industry has thrown out estimates of $3-6bn of their investment over that time. Personally I believe it will be even more than this - but that is not a bad thing.
Innovation doesn't occur just in technology - it can and should happen with distribution, delivery, user experience, billing and so on.
But each change requires commitment and reason, and an element of risk. Some changes don't deliver new revenue - but they improve how a service is used and what customers experience.
One of the best I have seen is with 2degrees, on their Pay Monthly plans:
I can set a billing threshold for my account, so I don't get billshock. For example, $250. At 80%, or $200, I get a warning text. At $250 my account gets suspended until I unlock it. It's a simple set-and-forget procedure, removes the opportunity to blow my bill (a BIG problem with Pay Later services), and gives me huge confidence to use it.
Where the innovation is required: unlocking it. I have to go online, or make a call to the call centre.
Why can't I just send a text back to a service number to say 'thanks for saving me, please unlock my account now'?
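The whole mechanism fits in a few lines. This is a toy Python model of the cap as I've described it - not 2degrees' actual implementation, and the $100 unlock extension is my invention:

```python
class SpendCap:
    """Toy model of a pay-monthly spend cap with a warning threshold."""

    def __init__(self, cap, warn_fraction=0.8):
        self.cap = cap
        self.warn_at = cap * warn_fraction
        self.spent = 0.0
        self.suspended = False
        self.warned = False

    def charge(self, amount):
        """Record spend; return what the network should do next."""
        if self.suspended:
            return "blocked"
        self.spent += amount
        if self.spent >= self.cap:
            self.suspended = True
            return "suspended"
        if self.spent >= self.warn_at and not self.warned:
            self.warned = True
            return "warning text sent"
        return "ok"

    def unlock(self, extension=100):
        # The step that today needs a web login or a call centre -
        # and could just as easily be a reply text.
        self.cap += extension
        self.suspended = False

account = SpendCap(250)
print(account.charge(150))  # ok
print(account.charge(60))   # warning text sent ($210 is past the $200 mark)
print(account.charge(50))   # suspended ($260 is past the $250 cap)
account.unlock()
print(account.charge(10))   # ok - cap extended to $350
```

The logic is trivial; the innovation gap is purely in the unlock channel.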
2. YouTube HD Video
3. The ever-present Microsoft and Apple patches, regular as clockwork and flippin' enormous every time
4. Virtual working (Citrix, VMWare and so on), due to the need for LOW latency.
I also found a useful extra which I thought was quite good:
Plays For Sure content. Over xmas, my kids got some DVDs they wanted to watch on dad's iPod. These DVDs came with the option to get a digital version that works across a number of widgets.
Each DVD has a unique 500-number key, but once entered correctly you get to DOWNLOAD a new file that gets deposited in your library (iTunes in my case). Each movie is high quality, scales from iPod to 24" monitor without artifacting, and is 1.25GB in size.
In my previous article, I discussed 'Always On' - the concept whereby you can always get what you want, with blistering speed (http://www.geekzone.co.nz/antoniosk/7513). This was one of those times that speed mattered - and the movies just flew down. I have also downloaded a few hefty album CDs, which come replete with Video Singles - fantastic, beautifully encoded content that looks the biz. And boy does it burn the GBs.
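The arithmetic behind 'the movies just flew down' is simple enough to sketch (decimal units, ignoring protocol overhead):

```python
def download_minutes(size_gb, speed_mbps):
    """Minutes to fetch size_gb gigabytes over a speed_mbps link,
    ignoring protocol overhead."""
    size_megabits = size_gb * 1000 * 8  # decimal GB -> megabits
    return size_megabits / speed_mbps / 60

# One of the 1.25GB digital-copy movies:
for speed_mbps in (15, 100):
    minutes = download_minutes(1.25, speed_mbps)
    print(f"{speed_mbps}mbps: ~{minutes:.1f} minutes")
```

Call it eleven-odd minutes at 15mbps versus under two at 100mbps - the difference between wandering off for a cuppa and just pressing play.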
I don't really care that this doesn't have 'GEEK' appeal; I am well capable of finding filched content like most people, but I choose not to, because the experience is just so poor - and to what end? I've got friends that try to get the latest movies which have been camcorded from the theatre and sent out on the torrents... oooo, now there's something I'd like to share, dodgy video with people coughing in the background. Fun.
It reminds me of watching the cricket at the basin by climbing the trees; sure, you got away without a ticket, but it was a pain in the bum (literally) and ultimately not that enjoyable.
100mb is not fibre. Fibre is a technology that could be used to deliver high speed connections, of which internet is one possibility, but which also allows high-grade video, high quality voice, multiple call lines into a premises and so on. But fibre means new powered equipment in the premises, video-capable devices (do YOU see a camera on your TV?), new computers, and upgrades.
Yet on the whole, this is becoming more frequent. Mobiles turn over pretty fast, and they come with a huge range of built-in capability. My mobile is 4 years old (really), and if I ever get another I know its replacement will be 10x better than what I have now. It will be replaced when it finally dies, by necessity, like nearly all mobiles (and judging by performance, that's about 5 weeks away). My computer is also 4 years old - an eternity in technology lifecycle. The next generation of consoles - Playstation 4, Wii 2, Xbox 720, whatever - will all be wifi'd to an inch of their life, ready for high-speed internet in the home.
Yet we wring our hands over what a change in network technology will do. Therein lies the rub, and it's not the show-stopper people make it out to be. Sure, as a world we got used to having landlines that were powered from the exchange, meaning we could make a 111 call in a power outage. Many of these folks will also have DECT phones which need mains to run, and even more people in younger demographics go mobile only - battery powered. So what we actually got used to was ALWAYS ON; the comfort that came with knowing you could make an emergency call, should you need to. THAT is what needs to be worked on - not what can go wrong, but how we turn the change into opportunity, and just get on with it.
Thankfully, some companies are. Others are working towards getting on with it. But get on with it we should. Where there are services already, people should sell. The metro areas of the main cities are, I believe, pretty well served, even if many of the telcos have a poor to abysmal public record of delivery. Next energies should go to those populous areas that don't have choice: Greater Auckland and Waikato. Hawkes Bay certainly. Taranaki too. Manawatu seems to have some choice. Greater Canterbury certainly needs some now. Otago/Queenstown and Southland.
I read a great quote the other day:
Amateurs talk about making change. The achievers just get on and do it, day by day.
Way back when, these concepts were analysed in depth, at length, and serious money was spent verifying whether always-on was relevant, and whether the public at large would comprehend the whole MB charging concept.
At the time, the only application that was 'Always-on' was your voice and text messaging service. The voice and text 'app' were embedded in the phone, were a core part of how it worked, and were not considered an 'app' at all, as it just came with the phone.
I bring this up in terms of context for the High Speed Internet service I am using, on TelstraClear's cable network, in Wellington. The speed is running at 100mbps download and 10mbps upload, maximum. See earlier comments here http://www.geekzone.co.nz/blogentry.asp?postid=7494
A while ago I was asked what 100mbps was good for. And just like those early days of GPRS, I thought about finding an application or use case expression... and failed dismally, because that's not the way to view the opportunity.
The pace in the western world is accelerating. Information is more readily available, in more forms, quicker than ever before. Perhaps it is hard to digest. Or perhaps we just need to expand how we use our brains and learn to filter more effectively, or listen to others and get their view. But it's not going to slow down. Information will not decrease. Live with it.
So to quote an overused expression, we have to 'suck it up'.
And in that respect: I don't want to wait, and I don't want to compromise what I do get. A 15mbps connection on TelstraClear cable is pretty good. You can download an average quality YouTube clip in semi-reasonable time.
But what brought 100mbps home for me, was watching my daughters explore YouTube and download High Definition content as the default, not the fallback. I hate grainy movies and poor quality audio - I don't have time for it. Huge downloads are a pain when your link is slow, and irrelevant when it only takes seconds.
'Glee' gets a good amount of airtime here. If you can tolerate the stageshow nature of the programme - I enjoy musicals, so no problem for me - the difference is amazing (720p vs 360p) when you upscale and go fullscreen, especially to a large TV.
It also veritably FLIES down, starting to exercise the Youtube cache that TelstraClear put in a little while ago.
Hardly stuff that's going to add another $100bn or so to the NZ economy. Parking the hyperbole, faster speed does lead to new experiences, and that's what it's about at the end of the day.
With speeds like this becoming widely available, paired with a high-quality wireless router (like an Airport Extreme, which I think works brilliantly), the 'concept' of always-on for wireless widgets (iPods, smartphones, iPads) as well as streaming content to a TV from the web - well, it's all just there. You STOP having to make a cup of tea waiting for the content.
You just get on with it.
More coming after the xmas break... have a good holiday, wonderful surfing and enjoy the sun....
Disclosure: I work for TelstraClear, in product development and strategy.
In marketing & management vernacular this would be the familiar terms of 'early adopter', 'leading edge' and 'pioneer'. I particularly like 'pioneer' - it conjures the image of a hard man in a strange place, almost alone, and making things work because they have to. The number 8 fencing wire myth of how New Zealand was made in particular resonates with the image. Ringing in my mind to this day though, is a quote I heard while studying at University, about why IBM were never pioneers in a technology.
The quip that came back was that 'pioneers were the ones with arrows in their ar*e', and that IBM chose to follow in the early footsteps of pioneers so they could make things 'go large', to use another term familiar to New Zealanders about success.
I like to think I'm a man of firsts. If not in carving out raw wilderness - my house has a wild enough section to keep me occupied for some time - then certainly in the area of technology and communication services. That's Mobile, VOIP, Internet, TVoverIP and so on, in common terms. And if not a pioneer - I look for help as much as the next person - then certainly someone focused on moving from the old to the new, in a very large way.
So the Government's first announcements for UFB were interesting: Northpower and WEL. I worked on an early TCL project to use Northpower's fibre network, the first services of which went to market in November 2008. These guys are definitely focused on more fibre, so it was an easy first win for the Crown. The next was watching the announcements on bandwidth and the art of the possible, for residential and business customers, and the more mundane first products CFH has announced (30/10, 100/50 and 1/1Gbps), all with a minimum bandwidth of 2.5 CIR.
I recently joined the 100/10 mb/s trial service that TCL is running, for those with access to the HFC network. I changed from the Lightspeed40g 15/2 package, which most HFC customers got in the price change implemented on October 1. The data cap is set at 120gb, and so far I have used 6GB. Some weeks prior I was asked what 100mb is actually good for; what does it enable that the current speeds don't; and what are people likely to ask for? Being able to say 'I have tried; I have researched; I have discovered; I can comment' based on the real world, rather than the lab, is invaluable. To use a sporting metaphor, it's easy to read the theory on playing football, but at some point you need to get on the field and kick the ball.
So first things first: getting connected, which was easy. I replaced the old Motorola stand-up SURFboard modem with the new Cisco DPC3010, which is a lay-flat, and quite tiny by comparison (15x14x3cm). It comes with 1 GigE WAN port, USB2 data port and of course the F-Connector to connect to the cable network. The unit is in an 'entertainment' cabinet but has about 20cm of ventilation above it - and it needs it. The heat from the unit is noticeable, like most Cisco gear I've ever used.
This unit is connected to a modern 802.11n wireless router. The router/switch equipment is HUGELY important when it comes to high speed internet - not least of which, the wireless device you use. The configuration of WIFI+Internet can't be ignored - and the way WIFI works doesn't easily match up with how wired Internet works.
The main issue is error correction and speed. 802.11g routers are sold as "up to 54mbps", which is technically accurate. But this is 54mbps for the wireless link, and much of that bandwidth is chewed up in protocol overhead and error correction - so you'll get about 20mbps clear to your computer by the time you're done.
802.11n increases this threshold to about 150mbps in the air - but of course, both device and access point need to be compatible, and you need to be sure they aren't too far apart. The further apart devices are, the weaker the signal, the greater the error correction and reprocessing. We haven't moved that far away from the basic principles of radio: poor signal = poor quality. Running a speedtest here, I get consistent reports of 90mbps wired, and between 30-50mbps over WIFI 802.11n.
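Those speedtest numbers line up with a rough goodput estimate: advertised air rate times a protocol-efficiency factor. The factors below are ballpark assumptions, not measurements:

```python
def wifi_effective_mbps(link_rate_mbps, efficiency):
    """Rough usable throughput: the advertised air rate scaled by an
    efficiency factor covering framing, acknowledgements and retries."""
    return link_rate_mbps * efficiency

# 802.11g at 54mbps commonly yields ~40% goodput; 802.11n efficiency
# drops with distance, hence a range:
print(f"802.11g: ~{wifi_effective_mbps(54, 0.4):.0f}mbps usable")
for efficiency in (0.2, 0.33):
    print(f"802.11n: ~{wifi_effective_mbps(150, efficiency):.0f}mbps usable")
```

The 802.11n range works out at roughly 30-50mbps, which is exactly the spread I see on the speedtest.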
So far I haven't said a word about what 100mb would be good for. When I was asked my opinion way back when, here's what I said:
1. Big-draw items, like iTunes, Skype HD Video, Torrent websites and other streaming media like Youtube or IPTV like Ziln, although pipe speed is just one factor
2. Point to multipoint video
3. Any work involving large file transfers (Microsoft Patch Tuesday anyone??)
4. Hosted work involving Citrix, VMWare and other machines within machines. Not because of the bandwidth, but because of improved latency - a 100mb connection will almost certainly operate with very low latency, on high-grunt infrastructure.
And of course the old stalwart of the technology industry: 'applications we've yet to imagine but for which 100mb will be great', or 'build it and the apps will come'.
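On point 4, a tiny illustration of why latency rather than bandwidth sets the floor for Citrix-style sessions - the round-trip counts here are illustrative only:

```python
def session_delay_ms(round_trips, rtt_ms):
    """Total waiting time for a chatty interactive exchange:
    each round trip costs one full RTT regardless of bandwidth."""
    return round_trips * rtt_ms

# A screen update needing 10 request/response exchanges:
for rtt_ms in (5, 50):
    print(f"RTT {rtt_ms}ms -> {session_delay_ms(10, rtt_ms)}ms of pure waiting")
```

A fat pipe with 50ms latency still feels sluggish for hosted desktops; a low-latency link feels snappy at almost any speed.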
So what have I found?
1/ My iTunes does download content faster. Purchased music just sounds better to me - the audio levels are balanced, the albums are complete, and the format works brilliantly for my iPod. Of course, my 4-year old PC still takes an age to churn through what I've downloaded and present it to me - my 100mb internet hasn't made my computer any faster!
2/ Citrix and VMWare run a lot more snappily for me.
3/ The web runs as fast as it ever did, although Microsoft and Apple patchfiles do get delivered faster.
I'm keen to see where this capability leads. A burst speed of 100mb in isolation is interesting but a little early - the Interweb's services are not scaled or dimensioned for a general population wanting to communicate at 100mb (more like 1mb). Sustained speed and latency would be intriguing - watching Apple movie trailers at 1080p was actually possible tonight (these files are around 200mb in size and take an age to download even on a good quality low-speed connection).
When the plumbing layer gets to the point where speed is not an issue - great, and not before time. Moving to the next step - turning over solid, reliable and consistent services - now that will be a good move.
Comments welcome. I don't know where this technology will take us - but I'm interested to hear what others have to say.
As well as selling AAPT's consumer division, Telecom said it had sold AAPT's 18.2 per cent stake in iiNet to institutional investors for A$70 million, A$11 million less than its carrying value as at June 30.
Combined with the proceeds from its sale of 10.1 per cent of Macquarie Telecom announced yesterday, the deals will realise about A$140 million.
Telecom had reportedly been seeking more than $400 million for AAPT as a going concern.
It will now concentrate on running AAPT's fibre network and the wholesale and business divisions, it said.
The sale of the consumer unit will reduce 2011 forecast earnings for AAPT by A$10 million, Telecom said.
AAPT was expecting earnings of A$101.3 million for the year to June.
Telecom CEO, Paul Reynolds, said: "Together these transactions rationalise non-core assets, strengthen Telecom's financial position, and help reposition AAPT's operations into a focused, network-centric wholesale and corporate business that is well-positioned for future growth."
A Telecom spokesman said the company was now ''taking stock''.
''We're happy with the transactions we've made,'' he said.
''That's not to say if a good offer [for the rest of the business] was put in front of us we wouldn't look at it seriously. But having done these transactions, which we're pleased with, we'll take stock.''
The buyer of AAPT's consumer business, listed Australian telco iiNet, said it expected the acquisition to boost earnings by A$20 million in the first full year.
AAPT's 113,000 broadband subscribers and 251,000 other connections would bring its broadband customers to 652,000 and total active services to 1.3 million, it said.
iiNet will continue to buy wholesale services from AAPT.
The transaction requires approval of iiNet shareholders and an extraordinary general meeting is expected to be held in September.
My 500GB main disk died. No click of death. No warning from SMART. Nothing.
The disk had been running a little poorly for the last few months - an unfortunate fight between ACPI and APM meant its XP partition was never the same again.
But this morning - nothing. Not even a parting goodbye.
The darn thing won't register. I doubt it's even spinning up.
Now, it wasn't completely unexpected.... I have a new 1TB drive with Vista Ultimate on it (and please don't start on why not Win7... I did not have $400 spare) that I was progressively moving to, at a speed the wife would accept.
So now we're on Vista, I have a dead HD I'm wondering what to do with, and a lot of stuff to migrate very quickly.
But it got me thinking about the cloud, for the first time in a long time (especially given it's my job to have my head in the clouds).
I've lost no email. My precious media of the children is on a separate HD (which is about to be backed up AGAIN!). But I would love to have a safe store for what is important to me and my family.
With my TelstraClear Cable Internet, I can restore my email and any PC pretty easily - although getting Vista and the apps patched up again came to about 3GB in a day - but for real content? Forget it. Who can offer me 1TB of storage? And what Internet service can I use to upload that amount of info?
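Some quick arithmetic on why 'forget it' is about right (decimal units, ignoring overhead; the 2mbps uplink is an assumed typical figure):

```python
def upload_days(size_tb, upload_mbps):
    """Days to push size_tb terabytes up a link of upload_mbps megabits/second."""
    megabits = size_tb * 1_000_000 * 8  # decimal TB -> megabits
    seconds = megabits / upload_mbps
    return seconds / 86400

# 1TB at a typical uplink, and at the 10mbps uplink on the 100/10 trial:
for up_mbps in (2, 10):
    print(f"{up_mbps}mbps up: ~{upload_days(1, up_mbps):.0f} days for 1TB")
```

Even on the trial's 10mbps uplink, the initial backup of 1TB runs to more than a week of continuous uploading.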
A home server might be the answer... but that also has a hard disk that will eventually die. And on the story goes.
I've been struggling to think about what use a fast fibre network could be. This is one of the uses.
But then economic reality steps in... my wife says she'd pay $30/month to back up our data. That's real world consumer expectation, and she doesn't care what's involved in making it happen.
So who will be first with a 100Mbps Internet service and an unmetered 1TB in the sky? $30 a month up for grabs...
Telstra has come to terms with NBN Co in a deal valued at $11 billion which will see the carrier decommission both copper and HFC telephone & broadband services.
Under the heads of agreement announced this afternoon, Telstra would provide access to Telstra facilities and progressively migrate Telstra traffic onto the National Broadband Network, subject to regulatory approval. The agreement for these terms will have an approximate value of $9 billion.
Separately, the Federal Government has agreed to progress “public policy reforms” with an attributed value of approximately $2 billion. These basically involve changes to Telstra’s current universal service obligations with the establishment of a new Commonwealth entity – USO Co – which will deliver unprofitable services. USO Co will receive a maximum of $100m in annual taxpayer contributions, with the rest to be funded by presumably increased industry contributions. It will take over Telstra’s USO obligations from 2011.
Telstra also said it has received a written agreement from the government that it will be able to participate in LTE spectrum auctions under the deal.
“This is a sound outcome for NBN Co because when finalised it can maximise the use of existing infrastructure and accelerate the roll out of its network,” NBN Co CEO Mike Quigley said in a press release.
NBN Co added that Telstra would likely become its largest customer. NBN Co will pay Telstra for migration of traffic on to the NBN and the decommissioning of its network.
The Heads of Agreement also provides for NBN Co’s use of Telstra’s “existing fit-for-use infrastructure, such as ducts, pits and conduit”, and a right to acquire Telstra backhaul services and space in Telstra exchanges. “While there is a considerable amount of negotiation and contractual work to go, we believe this agreement is a significant step forward to creating a more competitive telecommunications industry,” Quigley said.
Telstra expects to be able to put the deal to a shareholder vote in the first half of next year. The deal is subject to both that vote and ACCC approval.