Web performance is interesting, if you like that nerdy stuff. More than half of users will abandon a website if the initial load takes longer than about 3 seconds, so most companies aim for an initial load of 1 second. After the initial load, people perceive an element of the site as 'instant' if it responds in 200 ms or less. A round trip across the US, coast to coast, already eats ~50 ms, so actually doing something with the user's data has to be very performant. Outside the US, especially in India, the latency is pretty terrible. Because of how connections route through different vendors, you'll sometimes see one half of a city with very fast times and the other half with terrible ones. Same for mobile networks. It's absolutely an unregulated wild west out there, and users don't see it. There are also bad actors all over the place.
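To make those numbers concrete, here's a rough back-of-the-envelope sketch in Python. The round-trip counts are my assumptions for a cold HTTPS connection, not measurements:

```python
# Back-of-the-envelope: how a 50 ms coast-to-coast round trip eats
# a 200 ms "feels instant" budget. Round-trip counts below are
# assumptions for a cold HTTPS connection, not measurements.
RTT_MS = 50          # ~US coast-to-coast round trip
BUDGET_MS = 200      # threshold where a response still feels instant

round_trips = {
    "TCP handshake": 1,
    "TLS 1.3 handshake": 1,
    "HTTP request/response": 1,
}

spent = sum(round_trips.values()) * RTT_MS
print(f"network alone: {spent} ms of a {BUDGET_MS} ms budget")
print(f"left for the server and rendering: {BUDGET_MS - spent} ms")
# network alone: 150 ms of a 200 ms budget -> the server gets ~50 ms
```

Three round trips and the network has burned 150 ms before your code even runs, which is why getting physically closer to the user matters so much.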
So having both data centers and tons of mini data centers called PoPs (points of presence) near large cities is how the internet stays fast. These PoPs terminate the initial connection between the user and the application in the data center, sometimes figure out the best way to steer traffic, and sometimes serve cacheable content directly.
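If it helps, here's a toy sketch of the steering idea: send the user to whichever PoP is closest. The PoP list and coordinates are made up for the example, and real steering weighs far more than geography:

```python
import math

# Toy "steering" sketch: pick the PoP nearest the user, roughly the
# way a GeoDNS/anycast layer might. PoP coordinates are invented.
POPS = {
    "us-east": (40.7, -74.0),   # near NYC
    "us-west": (37.4, -122.1),  # near SF
    "eu-west": (51.5, -0.1),    # near London
}

def nearest_pop(user_lat: float, user_lon: float) -> str:
    # Crude flat-earth distance; real systems also weigh link
    # congestion, peering arrangements, and PoP load.
    def dist(pop: str) -> float:
        lat, lon = POPS[pop]
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(POPS, key=dist)

print(nearest_pop(34.0, -118.2))  # a user in LA -> "us-west"
```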
Another thing that keeps sites fast is content delivery networks, or CDNs. CDN companies sell their service to other companies on the internet; their value is in maintaining a huge number of PoPs and data centers to hold your highly cacheable content that changes pretty seldom, like your profile picture on LinkedIn or Facebook. All a CDN does is serve up cached files quickly from many data centers and a large number of PoPs. So when you load a web page, it's fetching stuff not just from the company that owns the site, but from CDNs, and often from third parties as well: trackers, servers that provide fonts and images. It's a huge mess.
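The edge-cache part is simple enough to sketch. This is a minimal, hypothetical version, assuming a dumb TTL policy; `fetch_from_origin` is a stand-in for the real origin request, not any particular CDN's API:

```python
import time

# Minimal sketch of what a CDN edge does: serve from local cache
# while an object is fresh, fall back to the origin otherwise.
CACHE: dict[str, tuple[float, bytes]] = {}  # url -> (expiry_ts, body)
TTL_SECONDS = 86400  # long TTL suits rarely-changing assets like avatars

def fetch_from_origin(url: str) -> bytes:
    # Hypothetical stand-in for the slow trip back to the origin server.
    return b"...bytes from the faraway origin..."

def edge_get(url: str) -> bytes:
    now = time.time()
    hit = CACHE.get(url)
    if hit and hit[0] > now:       # fresh copy at the edge: fast path
        return hit[1]
    body = fetch_from_origin(url)  # miss or stale: slow path to origin
    CACHE[url] = (now + TTL_SECONDS, body)
    return body
```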
Video and voice are the new frontier of this stuff, and they're more complexity stacked on top of the previous complexity. They demand low latency all the time, so a lot of it is deployed in PoPs or over peer-to-peer connections when possible.
Depends on the data. For your family photos and the home videos from the 90s you digitized, it doesn't matter where they live. If it's financial data or a stock market exchange, then speed, and thus nearness, is a big factor.
Isn’t the whole point of a data center that you don’t need to be near one to store your data?
Then I guess the only option is to shut down Las Vegas and never go back. I’d be fine with that.