Many cloud providers promise high-speed hosting alongside every other feature. That sounds great, but there is much more behind the speed of a hosting service than users usually remember. Distance may seem meaningless in the internet era, but it does affect web hosting.
Yes, speed is one of the most important aspects of cloud computing. But do you know what actually influences cloud speed?
- Bandwidth of your internet connection
- Number of other users and/or applications using the bandwidth on your local network
- Your contention ratio
- Distance from your local exchange
- Throttling / traffic shaping by your ISP
- The load of the server you are connecting to
- Nature of storage device on your cloud infrastructure (traditional HDD vs SSD)
- The effectiveness of the hypervisor on the host machine (Thanks to Salvatore Cordiano for pointing out this factor)
There is another element that is crucial to a website’s performance: a metric that affects users no matter where they are. We are talking about latency.
In this post we will explain what latency is and how it affects your website’s performance, which in turn impacts your online business.
Bandwidth and Latency
To better explain the two concepts, we’ll use the metaphor of a highway. It is one of the simplest ways to illustrate the difference between bandwidth and latency.
What is Bandwidth?
Bandwidth is the capacity to transfer data per second. In the highway example, it corresponds to the width of the highway: the more lanes it has, the more cars can pass per second.
The same goes for bandwidth: the larger it is, the more data passes per second.
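To make the idea concrete, here is a rough sketch of how bandwidth translates into transfer time. The numbers are illustrative assumptions, not figures from any specific provider, and the formula ignores latency and protocol overhead:

```python
# Rough illustration: transfer time = data size / bandwidth
# (ignores latency, protocol overhead, and congestion).

def transfer_time_seconds(size_mb: float, bandwidth_mbps: float) -> float:
    """Time to move `size_mb` megabytes over a `bandwidth_mbps` megabit/s link."""
    size_megabits = size_mb * 8  # 1 byte = 8 bits
    return size_megabits / bandwidth_mbps

# A 100 MB file over a 100 Mbit/s link takes about 8 seconds;
# doubling the bandwidth halves the time.
print(transfer_time_seconds(100, 100))  # 8.0
print(transfer_time_seconds(100, 200))  # 4.0
```

In the highway metaphor, doubling the bandwidth is like doubling the number of lanes: twice as many cars (bits) get through per second.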
What is Latency?
Latency is the time a packet of data takes to travel from its origin to its destination. In our example, it is the delay caused by stopping at a toll booth or another obstacle. The more toll booths and obstacles you face on the road, the longer it takes to reach the destination.
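One simple way to get a feel for latency is to time how long a TCP handshake takes to a given server. The sketch below uses only the Python standard library; the host you measure against is up to you (the port and timeout values here are assumptions, not requirements):

```python
import socket
import time

def tcp_connect_latency_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time one TCP handshake to (host, port) in milliseconds.

    This is a rough latency proxy, not a full ping: it includes DNS
    resolution and the three-way handshake, but no data transfer.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000
```

For example, `tcp_connect_latency_ms("example.com")` (a placeholder host) will typically report a few tens of milliseconds to a nearby server and hundreds of milliseconds to one on another continent.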
What causes latency?
To answer this question, imagine that you can host a server either in Amsterdam or in Miami, in the USA. Which one would you choose?
The best option would certainly be the one closest to your audience!
That’s because when a user accesses your website, their computer sends and receives data at close to the speed of light, passing through gateway nodes along the way. Each of these hops adds its own processing delay.
So, the further the distance between user and server, the greater the latency will be.
Now that you know what bandwidth and latency are, you might be wondering about solutions to avoid latency and delay problems.
Especially when it comes to SSH access and workloads with frequent database queries, latency can be a deal breaker for many companies. No user will hire a service that causes such problems.
The best solution is to host websites and applications as close to the audience as possible. Even though the delay may disturb some distant users, the target audience should always be the priority. In this case, distance matters.
In this post, we have seen how distance affects speed and other aspects of hosting. A very low price can be tempting, but it may come with deal breakers such as delays. That’s why we strongly recommend thorough research before hiring any hosting service.
LetsCloud has servers in multiple locations. We made that decision after listening to many professionals who needed servers close to them but couldn’t afford expensive resources. We also believe that being present in different regions lets us offer the best solutions to the public.
Have you ever had problems with long-distance hosting? Tell us in the comments what solutions you found.