

quentinreade · 304 posts · Ultimate Geek · Trusted · Vocus
Topic # 128675 · 19-Aug-2013 13:34 · 4 people support this post

Hi all,
Just wanted to share the latest developments and improvements here with you all.

We have been working on massively extending capacity over the past few months.

We have just added 9Gbps of international capacity, and made a few changes (hardware and interface) - and I suspect many of you have already noticed the differences.

This has been a huge body of work, and we are working hard to make our service as fast as possible.

We still have some work to do - YouTube (and other Google services) are a bit slow at times, but they'll be getting a 2.5Gbps boost in September, and we expect YouTube in particular to improve dramatically.

We will then have loads of capacity, but it's not the only aspect to performance, so (post September) we'll be keen to hear if there are services you use that aren't running as well as you expect.

We'll be paying attention to Spotify, BBC iPlayer, Netflix, and the other likely suspects, but if there is something outside this that you are using, just let us know.

We will be tuning in the meantime, but until we have finished adding capacity, any tuning we do today will quickly be undone by new links.
The internet finds a way to every destination, but sometimes via a convoluted path. Our calling is to make these paths as direct as possible - it's a bit of a black art, but we have some clever folks up to the task.

We are also looking at new caches, updates to existing ones, compression technology (for streaming video sites to make them stream more often in HD with less jitter), and some innovative solutions to P2P download speeds.

In short, we are dedicated to making our network the best it can be.

We are getting some positive feedback, and it's a real fun project for the team here.

As of today we have completed around 75% of the improvements, with more coming over the next three months.

As always, we are keen to get your feedback, and find out if there is more we can do.

Cheers




Head of Brand and Communications
Vocus NZ
[Slingshot, Orcon and Flip]


1919 posts · Uber Geek · Subscriber
Reply # 881000 · 19-Aug-2013 13:38

Great news, Quentin, and thank you for this update. Really good to be kept in the picture.

Yep, YouTube has been scratchy at times!!

xpd · Chief Trash Bandit · 9055 posts · Uber Geek · Mod Emeritus · Trusted · Lifetime subscriber
Reply # 881006 · 19-Aug-2013 13:49

Cheers Quentin, appreciate the update :)




XPD / Gavin / DemiseNZ
For Free Games, Geekiness and Reviews, visit: Home Of The Overrated Raccoons
Battlenet : XPD#11535 · Origin/Steam/Epic/Uplay : xpdnz


Screeb · 671 posts · Ultimate Geek
Reply # 881172 · 19-Aug-2013 19:32 · 3 people support this post

All sounds great, especially YouTube/Google (Maps, in particular - it's slow as molasses these days) and P2P speed boosts. Haven't noticed any difference yet though (for everything else - I know the stuff I mentioned isn't done yet).

However, one part caught my eye...

quentinreade:
compression technology (for streaming video sites to make them stream more often in HD with less jitter)


For the love of all that is unholy PLEASE DO NOT DO THIS! (unless it's optional). Never ever ever re-compress content. I don't care if anyone claims it's hardly noticeable. I will notice. Do not want. If I ask for some bits, give me the bits I asked for. Do not give me an approximation. The amount and type of compression is up to the content provider to decide, not you. Unless I can compress my bill with lossy compression too ;)

I know some people would like it, but just be aware that others do not.

mercutio · 1387 posts · Uber Geek
Reply # 881279 · 19-Aug-2013 22:19

Screeb: [snip] For the love of all that is unholy PLEASE DO NOT DO THIS! (unless it's optional). Never ever ever re-compress content. [...] The amount and type of compression is up to the content provider to decide, not you.


it may be deduplication, which can work really well for general unencrypted video, and also for things like bittorrent if there's sufficient cache size.

as far as i can tell they want to improve performance rather than reduce your bill, but it may mean more consistent performance.

the main problem with things like caching is you need a lot of storage, and even the likes of youtube, akamai etc can still run into storage bottlenecks.

generally speaking though lossless compression is different from lossy compression, and should give an experience at least as good.

and with the rise in popularity of sites like twitch.tv there are lots of similar streams starting at different positions, but with duplicated content.  so for instance they could see the data in the US, send it to NZ, do checksumming at both ends, and if the checksum matches, send a reference to the checksum rather than the data itself.  the catch is that a checksum could in theory match completely different data, so you'd get the wrong data for a block - which is why the checksum is usually quite large, and/or there's a check that the data is in fact the same (which works if you store it in both locations).
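a minimal sketch of that block-hashing idea (purely illustrative - the fixed 64KB blocks and sha-256 are my assumptions, not anything the ISP has described):

```python
import hashlib

BLOCK = 64 * 1024   # hypothetical fixed block size
seen = {}           # digest -> block: the dictionary shared by both ends

def dedup_stream(stream):
    """yield raw blocks, or short references for blocks already sent."""
    while True:
        block = stream.read(BLOCK)
        if not block:
            break
        digest = hashlib.sha256(block).digest()
        if digest in seen:
            yield ("ref", digest)          # far end already holds this block
        else:
            seen[digest] = block
            yield ("data", digest, block)  # send it once, remember it
```

a 32-byte reference in place of a 64KB block is the whole win; a colliding digest is the "wrong data" risk mentioned above, which is why you use a large digest and/or a byte-for-byte check.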

i dunno if that's what their plan is, but it's not a bad way to go.  wikipedia on deduplication is at: http://en.wikipedia.org/wiki/Data_deduplication

i'm not quite sure how the common meaning of "compression" changed from lossless to lossy.  maybe when people moved from gif to jpeg, from wav to mp3, etc.  but remember there are still zips, rars, exes, games etc which are compressed losslessly.

192 posts · Master Geek
Reply # 881311 · 20-Aug-2013 00:38

It's working.
Early February was awful (and I am on fibre!), but every month since has been an improvement.
Keep up the good work. Posting on GZ is a great way to keep us informed, plus get good technical feedback. :)
Very happy to read Netflix performance is on the radar. It works very well, but it sure rips through the 60GB cap! (And that is on medium quality.) Please make sure it continues to work.

Screeb · 671 posts · Ultimate Geek
Reply # 881880 · 20-Aug-2013 22:32

mercutio: it may be deduplication, which can work really well for general unencrypted video, and also for things like bittorrent if there's sufficient cache size. [snip] i dunno if that's what their plan is, but it's not a bad way to go.


I'm not talking about "deduplication". That has nothing to do with compression, and there's nothing wrong with it.

There's no chance at all that they're talking about lossless compression. No video streaming site streams uncompressed content; that would be insane. At most you might find one or two niche sites that serve losslessly compressed video, but that's pretty much non-existent in the grand scheme of things.

Either way, whether the content is losslessly or lossily compressed to begin with, re-compressing it with lossless compression gains you (on average) nothing at all - and in the best case hardly anything, certainly not worth the extra processing power - unless the content provider used some terrible lossless compression to begin with, which is extremely unlikely. And if the content was originally lossily compressed, then re-compressing it with lossless compression is either impossible (standard video formats can only carry one layer of compression - you can't serve video that is effectively a zipped mp4/avi/mpeg; nothing will play it, and as mentioned, it gains nothing) or it will just produce a file much bigger than the original, because you've undone the space benefits of the lossy compression (while keeping the artifacts) and replaced them with a losslessly compressed version.
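The "no gain" claim is easy to demonstrate: already-compressed video is statistically close to random bytes, and lossless compressors can't shrink those. A quick sketch, with os.urandom standing in for an H.264 payload (my stand-in, purely illustrative):

```python
import os
import zlib

# Stand-in for already-compressed video: high-entropy random bytes.
payload = os.urandom(1_000_000)

# Losslessly re-compress at maximum effort.
recompressed = zlib.compress(payload, level=9)

# Ratio comes out at ~1.0 (slightly above, due to framing overhead).
print(f"ratio: {len(recompressed) / len(payload):.4f}")
```

Real video isn't perfectly random, but modern lossy-compressed streams are close enough that the result is the same in practice.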

So no, they are not talking about losslessly compressing streams. What they're talking about is taking your twitch.tv stream, and re-encoding it with more (lossy) compression. If that's not optional for the end user, then that's absolutely ridiculous.

mercutio · 1387 posts · Uber Geek
Reply # 881882 · 20-Aug-2013 22:39

Screeb: [snip] So no, they are not talking about losslessly compressing streams. What they're talking about is taking your twitch.tv stream, and re-encoding it with more (lossy) compression. If that's not optional for the end user, then that's absolutely ridiculous.



i don't know where you get the idea that they're going to re-encode, which i'm much more hesitant to call compression than deduplication.  compression normally includes elements of deduplication.

from:
http://en.wikipedia.org/wiki/Data_deduplication
"In computing, data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data."

that sounds to me like a form of compression.

the thing is if you have a video of:

abcdefghijklmnopqrstuvwxyz

and one user starts watching the feed before the video starts and gets the whole lot

then another user starts watching at m and watches through to z - the same data is going to be sent to both users.


and so using deduplication really helps when multiple users watch the same stream simultaneously, be it the olympics, bbc iplayer, twitch.tv, or youtube.
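to make that concrete, here's a toy version of the overlap above (the 4-byte chunking is arbitrary; real systems chunk at key frames or content-defined boundaries):

```python
import hashlib

def chunks(data: bytes, size: int = 4):
    """split data into fixed-size chunks (toy stand-in for real chunking)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

full = b"abcdefghijklmnopqrstuvwxyz"   # viewer 1 watches the whole stream
late = full[full.index(b"m"):]         # viewer 2 joins at 'm'

# cache every chunk viewer 1 received, keyed by digest
cache = {hashlib.sha256(c).hexdigest(): c for c in chunks(full)}

# count how much of viewer 2's stream is already sitting in the cache
hits = sum(hashlib.sha256(c).hexdigest() in cache for c in chunks(late))
print(f"{hits}/{len(chunks(late))} of the late joiner's chunks already cached")
```

here the late stream happens to line up with the chunk grid, so every chunk is a cache hit; getting that alignment on real traffic is exactly why the "where do blocks start" smarts below matter.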

youtube has their own caching solution, bbc iplayer uses streaming servers that don't have NZ nodes, and twitch.tv use their own servers as far as i know.

actually twitch.tv is one of the worst offenders of re-encoding.  you can actually get worse performance watching in 480p than watching "original", as for some reason their re-encoding seems to often have synchronisation issues.

basically with compression you sometimes need to be smart about where blocks of data start and pick up on that, and have a really huge dictionary, and then you'll find the relevant data.  in order to use a really large dictionary you need some kind of hash, with a bit of smarts like looking for key frames, to reduce the cpu cost of deciding where blocks start for comparison.

deduplication is a form of lossless compression.  basically you just need to stretch the compression across more than one stream to really reap the benefits, and do it in a way that keeps the resource usage down.



quentinreade · 304 posts · Ultimate Geek · Trusted · Vocus
Reply # 881992 · 21-Aug-2013 09:28

Hi guys,
It's an interesting debate. 
We will look at a range of technologies and test them - keeping what does the trick and discarding what doesn't.
Our goal is to make the internet experience better - knock out the jitter and buffering, but retain the quality - and we'll try a few tricks to do this.
If you can give us a heads up on the services you'd like us to work on first, that would be a great steer.
Cheers!





Head of Brand and Communications
Vocus NZ
[Slingshot, Orcon and Flip]


Screeb · 671 posts · Ultimate Geek
Reply # 882373 · 21-Aug-2013 19:41

mercutio: i don't know where you get the idea that they're going to re-encode, which i'm much more hesitant to call compression than deduplication.  compression normally includes elements of deduplication. [snip] deduplication is a form of lossless compression.

When I hear "compression", I think "compression", not "deduplication". Deduplication may be considered a type of compression, and the same principle is indeed used in many compression schemes, but it's just one component. If they mean deduplication then they should say that, not use an ambiguous word like compression. Like I said, there's nothing wrong with deduplication - if that is indeed what they are considering, then I have no qualms, so there's no need to try to "sell" me on it.

All I'm saying is that if it's lossy compression that they are intending to perform (which to me is most likely), then I do not want it forced.

There's an existing example of re-(lossy-)compression of web content, albeit in a better form: Opera Mini (and Turbo Mode in regular Opera) routes everything through Opera's servers, which lossily re-compress images so that pages load faster. That's fine because it's optional (and it makes sense if you're on a 3G connection visiting non-mobile-optimised sites). My concern is that ISPs may adopt this approach with no option for the end user to disable it.
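The image side of that kind of proxy is simple to picture: re-encode at a lower JPEG quality before passing the bytes on. A hedged sketch using Pillow (my choice of library; nothing here reflects how Opera actually implements it):

```python
from io import BytesIO

from PIL import Image  # Pillow: pip install Pillow


def recompress_jpeg(data: bytes, quality: int = 40) -> bytes:
    """Re-encode a JPEG at lower quality, as an optimising proxy might."""
    image = Image.open(BytesIO(data))
    out = BytesIO()
    image.save(out, format="JPEG", quality=quality)
    return out.getvalue()
```

The bytes that come back are an approximation of what the origin served - which is precisely the objection above when there's no way to opt out.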

1640 posts · Uber Geek
Reply # 882965 · 22-Aug-2013 18:13

Any chance of an opt-in proxy cache?

1640 posts · Uber Geek
Reply # 883556 · 23-Aug-2013 19:12

seems like something has worked ... certainly noticed an improvement around 10pm last night



quentinreade · 304 posts · Ultimate Geek · Trusted · Vocus
Reply # 883566 · 23-Aug-2013 19:36

^^ good, and you should notice more soon. :)

We are exploring some pretty smart proxy options too, but more about that later ... GZ will be the first to know (as always)




Head of Brand and Communications
Vocus NZ
[Slingshot, Orcon and Flip]


mercutio · 1387 posts · Uber Geek
Reply # 883570 · 23-Aug-2013 19:53

quentinreade: ^^ good, and you should notice more soon. :)

We are exploring some pretty smart proxy options too, but more about that later ... GZ will be the first to know (as always)


any plans to optimise interactive things like games, ssh etc? 

atm it can be hit and miss depending on whether traffic takes the short path or the long path.




quentinreade · 304 posts · Ultimate Geek · Trusted · Vocus
Reply # 883572 · 23-Aug-2013 19:56

Basically, yeah, we do. But if you and other GZers can let us know which ones in particular, then we can put those at the top of the list...




Head of Brand and Communications
Vocus NZ
[Slingshot, Orcon and Flip]


