Geekzone: technology news, blogs, forums


quentinreade

351 posts

Ultimate Geek

Trusted
2degrees

#128675 19-Aug-2013 13:34

Hi all,
Just wanted to share the latest developments and improvements here with you all.

We have been working on massively extending capacity over the past few months.

We have just added 9Gbps of international capacity, and made a few changes (hardware and interfaces) - and I suspect many of you have already noticed the difference.

This has been a huge body of work, and we are working hard to make our service as fast as possible.

We still have some work to do – YouTube (and other Google products) can be a bit slow at the moment, but they'll be getting a 2.5Gbps boost in September. The most obvious improvement will be to YouTube, which we expect to improve dramatically.

We will then have loads of capacity, but capacity isn't the only aspect of performance, so (post September) we'll be keen to hear if there are services you use that aren't running as well as you expect.

We'll be paying attention to Spotify, BBC iPlayer, Netflix, and the other likely suspects, but if there is something outside this that you are using, just let us know.

We will keep tuning in the meantime, but until we have finished adding capacity, the tuning we do today will quickly get undone by new links.

The internet finds a way to all destinations, but sometimes via a convoluted path. Our job is to make those paths as direct as possible; it's a bit of a black art, but we have some clever folks up to the task.

We are also looking at new caches, updates to existing ones, compression technology (for streaming video sites, to make them stream in HD more often with less jitter), and some innovative solutions for P2P download speeds.

In short, we are dedicated to making our network the best it can be.

We are getting some positive feedback, and it's a real fun project for the team here.

We have completed around 75% of the improvements to date, with more coming over the next three months.

As always, we are keen to get your feedback, and find out if there is more we can do.

Cheers




Comms chap

 

2degrees


linw
2849 posts

Uber Geek


  #881000 19-Aug-2013 13:38

Great news, Quentin, and thank you for this update. Really good to be kept in the picture.

Yep, YouTube has been scratchy at times!!



xpd

xpd
Geek @ Coastguard NZ
13765 posts

Uber Geek

Retired Mod
ID Verified
Trusted
Lifetime subscriber

  #881006 19-Aug-2013 13:49

Cheers Quentin, appreciate the update :)




Gavin / xpd / FastRaccoon / Geek of Coastguard New Zealand

LinkTree

Screeb
698 posts

Ultimate Geek


  #881172 19-Aug-2013 19:32

All sounds great, especially YouTube/Google (Maps, in particular - it's slow as molasses these days) and P2P speed boosts. Haven't noticed any difference yet though (for everything else - I know the stuff I mentioned isn't done yet).

However, one part caught my eye...

quentinreade:
compression technology (for streaming video sites to make them stream more often in HD with less jitter)


For the love of all that is unholy PLEASE DO NOT DO THIS! (unless it's optional). Never ever ever re-compress content. I don't care if anyone claims it's hardly noticeable. I will notice. Do not want. If I ask for some bits, give me the bits I asked for. Do not give me an approximation. The amount and type of compression is up to the content provider to decide, not you. Unless I can compress my bill with lossy compression too ;)

I know some people would like it, but just be aware that others do not.



mercutio
1392 posts

Uber Geek


  #881279 19-Aug-2013 22:19

Screeb: All sounds great, especially YouTube/Google and P2P speed boosts. [snip]

For the love of all that is unholy PLEASE DO NOT DO THIS! (unless it's optional). Never ever ever re-compress content. [snip] The amount and type of compression is up to the content provider to decide, not you.


it may be deduplication, which can work really well for general unencrypted videos and also for things like bittorrent if there's sufficient cache size.

as far as i can tell they want to improve performance rather than reduce your bill, but it may mean more consistent performance.

the main problem with things like caching is that you need a lot of storage, and even the likes of youtube and akamai can still run into storage bottlenecks.

generally speaking though, lossless compression is different from lossy compression, and should give an experience at least as good.

and with the rise in popularity of sites like twitch.tv there are lots of similar streams starting at different positions, but with duplicated content.  so for instance they could see the data in the US, send it to NZ, and do checksumming at both ends; if the checksum matches, they send a reference to the checksum rather than the data itself.  the risk is that a checksum could match completely different data, giving you the wrong data for a block, so usually the checksum itself is quite large, and/or there's a check that the data is in fact the same (which works if you store it in both locations).

i dunno if that's what their plan is, but it's not a bad way to go.  wikipedia on deduplication is at: http://en.wikipedia.org/wiki/Data_deduplication

i'm not quite sure how the common definition of compression for data changed from lossless to lossy.  maybe when people changed from gif to jpeg, from wav to mp3, etc.  but remember there's still zip, rar, exes, games etc which are compressed losslessly.
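
a rough sketch of the block-hash idea in python (purely illustrative - i have no idea what snap are actually building; the block size and hash choice are made up):

    import hashlib

    BLOCK = 64 * 1024  # 64 KiB blocks - arbitrary for the sketch

    def dedup_send(stream, seen):
        """yield ('ref', digest) for blocks the far end already has,
        ('data', digest, block) otherwise.  'seen' tracks the far end's cache."""
        for i in range(0, len(stream), BLOCK):
            block = stream[i:i + BLOCK]
            digest = hashlib.sha256(block).hexdigest()
            if digest in seen:
                yield ('ref', digest)          # ~64 bytes cross the link instead of 64 KiB
            else:
                seen.add(digest)
                yield ('data', digest, block)  # first sighting: send the real bytes

    def dedup_receive(messages, store):
        """rebuild the stream, filling 'store' (digest -> block) as data arrives."""
        out = []
        for msg in messages:
            if msg[0] == 'ref':
                out.append(store[msg[1]])
            else:
                _, digest, block = msg
                store[digest] = block
                out.append(block)
        return b''.join(out)

with a 256-bit hash the "checksum matches different data" risk is negligible, which is why real systems can get away with it (or verify on the side).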

scorpiworld
192 posts

Master Geek


  #881311 20-Aug-2013 00:38

It's working.
Early February was awful (and I am on fibre!), but every month since has been an improvement.
Keep up the good work. Posting on GZ is a great way to keep us informed, plus get good technical feedback. :)
Very happy to read Netflix performance is on the radar. It works very well, but it sure rips through the 60GB cap! (And that's on medium quality.) Please make sure it continues to work.

Screeb
698 posts

Ultimate Geek


  #881880 20-Aug-2013 22:32

mercutio:
it may be deduplication, which can work really well for general unencrypted videos and also for things like bittorrent if there's sufficient cache size. [snip]

generally speaking though, lossless compression is different from lossy compression, and should give an experience at least as good. [snip]


I'm not talking about "deduplication". That has nothing to do with compression, and there's nothing wrong with it.

There's no chance at all that they're talking about lossless compression. No video streaming site streams uncompressed content; that would be insane. You might find one or two niche sites serving losslessly compressed video, but that's pretty much non-existent in the grand scheme of things.

Either way, whether the content is losslessly or lossily compressed to begin with, re-compressing it with lossless compression gains you (on average) nothing at all - and in the best case hardly anything, certainly not worth the extra processing power - unless the content provider used some bad lossless compression to begin with, which is extremely unlikely. And if it was originally lossily compressed, then re-compressing it with lossless compression is either impossible (any standard video format can only have one level of compression - you can't serve video that is effectively a zipped mp4/avi/mpeg, because nothing will play it, and as mentioned, it gains nothing) or will just produce a file much bigger than the original, because you've simply undone the space benefits of lossy compression (while keeping the artifacts) and replaced them with a losslessly compressed version.

So no, they are not talking about losslessly compressing streams. What they're talking about is taking your twitch.tv stream, and re-encoding it with more (lossy) compression. If that's not optional for the end user, then that's absolutely ridiculous.
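
To make that concrete: the output of a decent lossy codec is statistically close to random bytes, and lossless compressors can't shrink random bytes. A quick Python check (illustrative only - the random payload stands in for real codec output):

    import os, zlib

    payload = os.urandom(1_000_000)        # stand-in for an H.264 stream: near-random bytes
    squeezed = zlib.compress(payload, 9)   # best-effort lossless compression
    print(len(squeezed) / len(payload))    # ~1.0003 - slightly BIGGER, not smaller

So the only way an ISP "compresses" video further is by throwing bits away.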

mercutio
1392 posts

Uber Geek


  #881882 20-Aug-2013 22:39

Screeb:
I'm not talking about "deduplication". That has nothing to do with compression, and there's nothing wrong with it. [snip]

So no, they are not talking about losslessly compressing streams. What they're talking about is taking your twitch.tv stream, and re-encoding it with more (lossy) compression. If that's not optional for the end user, then that's absolutely ridiculous.



i don't know where you get the idea that they're going to re-encode, which i'm much more hesitant to call compression than deduplication.  compression normally includes elements of deduplication.

from:
http://en.wikipedia.org/wiki/Data_deduplication
"In computing, data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data."

that sounds to me like a form of compression.

the thing is, if you have a video of:

abcdefghijklmnopqrstuvwxyz

and one user starts watching the feed before the video starts and gets the whole lot, then another user starts watching at m and watches through to z - the same data is going to be sent to both users.

so deduplication really helps when multiple users watch the same stream simultaneously, be it the olympics, bbc iplayer, twitch.tv, or youtube.

youtube has their own caching solution, bbc iplayer uses streaming servers that don't have NZ nodes, and twitch.tv use their own servers as far as i know.

actually twitch.tv is one of the worst offenders for re-encoding.  you can actually get worse performance watching in 480p than watching "original", as for some reason their re-encoding often seems to have synchronisation issues.

basically with compression you sometimes need to be smart about where blocks of data start and what they belong to, and have a really huge dictionary to find relevant data in.  to use a really large dictionary you need some kind of hash, plus a bit of smarts like looking for key frames, to reduce the cpu cost of working out where comparison blocks start.

deduplication is a form of lossless compression.  you basically just need to stretch the compression over more than one stream to really reap the benefits, and do the work in a way that minimises resource usage.
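
that alphabet example as a toy python sketch - assuming chunk boundaries line up (say, on key frames), the second viewer's overlap never crosses the international link twice.  purely illustrative, not anyone's real system:

    import hashlib

    link_bytes = 0   # bytes fetched over the international link
    cache = {}       # NZ-side chunk cache: digest -> chunk

    def fetch(chunk):
        """in reality the far end would send the digest first; the sketch elides that."""
        global link_bytes
        digest = hashlib.sha256(chunk).digest()
        if digest not in cache:
            link_bytes += len(chunk)   # cache miss: pay for the transfer once
            cache[digest] = chunk
        return cache[digest]

    stream = [bytes([c]) * 1024 for c in b'abcdefghijklmnopqrstuvwxyz']

    for chunk in stream:        # viewer 1 watches a..z
        fetch(chunk)
    for chunk in stream[12:]:   # viewer 2 joins at m, watches m..z
        fetch(chunk)

    print(link_bytes)           # 26624 = 26 KiB: m..z crossed the link only once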

 
 
 

quentinreade

351 posts

Ultimate Geek

Trusted
2degrees

  #881992 21-Aug-2013 09:28

Hi guys,
It's an interesting debate. 
We will look at a range of technologies and test them - keeping what does the trick and discarding what doesn't.
Our goal is to make the internet experience better - knock out the jitter and buffering, but retain the quality - and we'll try a few tricks to do this.
If you can give us a heads up on the services you'd like us to work on first, that would be a great steer.
Cheers!





Comms chap

 

2degrees


Screeb
698 posts

Ultimate Geek


  #882373 21-Aug-2013 19:41

mercutio:
i don't know where you get the idea that they're going to re-encode, which i'm much more hesitant to call compression than deduplication.  compression normally includes elements of deduplication. [snip]

deduplication is a form of lossless compression.  you basically just need to stretch the compression over more than one stream to really reap the benefits. [snip]


When I hear "compression", I think "compression", not "deduplication". Deduplication may be considered a type of compression, and the same principle is indeed used in many compression schemes, but it's just one component. If they mean deduplication then they should say that, not use an ambiguous word like compression. Like I said, there's nothing wrong with deduplication - if that is indeed what they are considering, then I have no qualms, so there's no need to try to "sell" me on it.

All I'm saying is that if it's lossy compression that they are intending to perform (which to me is most likely), then I do not want it forced.

There's an example of re-(lossy-)compression of web content that already exists, albeit in a better form: Opera Mini (and Turbo Mode on regular Opera) routes everything through Opera's servers, which lossily re-compress images so that pages load faster. That's fine because it's optional (and makes sense if you're on a 3G connection visiting non-mobile-optimised sites). My concern is that ISPs may adopt this approach with no option for the end user to disable it.
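
For the record, the Opera-style trick boils down to something like this (hypothetical sketch using Pillow; not anyone's actual middleware, and the quality setting is made up):

    from io import BytesIO
    from PIL import Image  # pip install Pillow

    def proxy_recompress(jpeg_bytes, quality=40):
        """Re-encode a JPEG at lower quality, as a Turbo-style proxy might.
        Lossy: the client gets an approximation of what it asked for."""
        img = Image.open(BytesIO(jpeg_bytes))
        out = BytesIO()
        img.save(out, format='JPEG', quality=quality)
        return out.getvalue()

Fine as an opt-in. Forced on every subscriber, not fine.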

MadEngineer
4271 posts

Uber Geek

Trusted

  #882965 22-Aug-2013 18:13

Any chance of an opt-in proxy cache?




You're not on Atlantis anymore, Duncan Idaho.

MadEngineer
4271 posts

Uber Geek

Trusted

  #883556 23-Aug-2013 19:12

Seems like something has worked ... certainly noticed an improvement around 10pm last night.




You're not on Atlantis anymore, Duncan Idaho.

quentinreade

351 posts

Ultimate Geek

Trusted
2degrees

  #883566 23-Aug-2013 19:36

^^ good, and you should notice more soon. :)

We are exploring some pretty smart proxy options too, but more about that later ... GZ will be the first to know (as always)




Comms chap

 

2degrees


mercutio
1392 posts

Uber Geek


  #883570 23-Aug-2013 19:53

quentinreade: ^^ good, and you should notice more soon. :)

We are exploring some pretty smart proxy options too, but more about that later ... GZ will be the first to know (as always)


any plans to optimise interactive things like games, ssh etc?

atm it can be hit and miss depending on whether traffic takes the short path or the long path.




quentinreade

351 posts

Ultimate Geek

Trusted
2degrees

  #883572 23-Aug-2013 19:56

Basically, yeah, we do. But if you and other GZers can let us know which ones in particular, then we can put those at the top of the list...




Comms chap

 

2degrees






