How to Optimize Your Website Speed by Improving the Backend
Website speed matters. According to Google, the faster your website loads, the higher it ranks in search results and the better your conversion rates. That’s why website owners care about website speed optimization. In one of our previous articles, we talked about tools that measure website performance and tips for increasing page load speed by optimizing your website’s frontend. In this article, we’ll talk about backend optimization. We’ll rank ways to speed up your website, showing you which optimizations have the most impact on load times. That means we’ll start with common issues like unoptimized queries, which slow down a website’s performance yet are easy to identify and fix. We’ll follow up those easy fixes by talking about database optimization, caching, web hosting solutions, and Content Delivery Networks.
1. Mitigating the N+1 Query Problem
The “N+1” problem slows down many applications. It occurs when an application issues one query to fetch a set of records and then one additional query per record to fetch its associated data, instead of issuing a single query that retrieves all the records that need to be loaded. In ActiveRecord ‒ the built-in object-relational mapping (ORM) tool in Rails ‒ the N+1 problem is solved with eager loading. With eager loading, all associated records are requested up front instead of with many small queries that drag down the application’s performance.
Moreover, we can use the Bullet gem to cut down on unnecessary queries. This gem tracks queries during development and notifies the developer when eager loading should be added to beat the N+1 problem, as well as when it’s better not to use eager loading at all.
2. Database Optimization
Unoptimized databases can also slow down your website. To speed up the database, consider using indexes and normalizing and denormalizing the database. Below, we’ll provide you with examples of how to do this with relational databases.
Normalizing a Database
Even though normalization is a matter of course when designing a relational database, some developers neglect this procedure at times.
Database design (including the structure of tables and columns and the relations between them) is guided by the concept of normal forms ‒ a progressive set of rules applied to a database schema for the purpose of normalization. The aim of database normalization is to reduce or eliminate redundant data and ensure sensible data dependencies in order to avoid anomalies when inserting, updating, or deleting data in database fields.
All told, normalization helps you decrease the amount of space a database occupies and efficiently organize the data to improve database performance.
Using Indexes
A common issue that slows down queries is indexes that aren’t used at all or are used improperly. Indexes in a database serve the same function as the index in a book: each index entry contains the name of the required object along with an identifier showing its location. Indexes are created for database columns so that queries don’t need to scan every row in a table to find matching data; instead, the database searches the much smaller index. However, keep in mind that indexes can also slow down your database: while they speed up reads, they slow down insertions, updates, and deletions, since every write must update the index as well.
Denormalization of a Database
Denormalization is the deliberate modification of a normalized database so it doesn’t comply with normal forms. The main goal of denormalization is to decrease the time required for select queries by adding redundant data like extra tables or attributes into existing tables to make data more accessible.
Database denormalization can help you address the following issues:
- Large number of table joins. Queries to a normalized database often need to join many tables. Since a table join is a resource-intensive operation, such queries use up server resources and take time to execute. To speed up these queries, consider denormalization: add an extra field to one of the tables to avoid the join.
- Calculated values. As a rule, queries that perform complicated calculations slow down your database’s performance. If your database regularly performs complicated calculations, it makes sense to add columns that hold frequently used, hard-to-calculate data. A column that contains pre-computed values can save a significant amount of time during query execution, though it also requires timely updating of the data in that column.
- Long fields. If a database contains large tables that contain long fields like Blob and Long, you can speed up query processing by moving long fields to a separate table.
3. Caching
Caching is the process of storing data in a cache so it can be reused, which means a web page doesn’t have to be rendered from scratch for every user. Caching enables users to work with a large amount of data in a short amount of time while using minimal server resources.
Caching can be implemented on the client side and on the server side. In our previous article, we talked about client-side caching (also called browser caching), which includes caching of images, HTTP headers, web pages, and so on. In this article, we’ll talk about server-side caching in more detail.
Server-side caching is the caching of data stored on the server; this data isn’t available to a client’s browser. There are common caching mechanisms, but each framework or CMS also has its own out-of-the-box implementation for caching entire web pages, fragments of web pages, and database queries. Since the main technology we work with at RubyGarage is Ruby and Ruby on Rails, we’ll provide you with some practical examples of caching with Rails.
SQL Caching
SQL caching in Rails caches the result of a select query. When Rails encounters the same query again, it returns the cached result instead of querying the database ‒ the repeated query never actually reaches the database. The first time the query runs, the result is stored in the query cache (in memory); subsequent identical queries retrieve it directly from memory. But remember that the query cache is only stored temporarily, which is why you should opt for low-level caching to store query results longer term.
Fragment Caching
Fragment caching is the most widespread type of caching. With fragment caching, separate page blocks are cached. This is useful for dynamic web applications, since their content is updated often and cached results can quickly become irrelevant. Moreover, complex web applications often contain many blocks and components, so any change to one block would require re-rendering and re-storing the entire page, which makes caching whole pages inefficient. That’s why caching separate fragments is the preferred way to improve your website performance.
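In a Rails view, fragment caching is just the `cache` helper wrapped around a block. A hypothetical product listing might look like this (the template path, partial name, and instance variable are assumptions):

```erb
<%# app/views/products/index.html.erb ‒ a hypothetical listing %>
<% @products.each do |product| %>
  <% cache product do %>
    <%# Rendered once per product and stored under a key derived from the
        record's id and updated_at, so it's re-rendered only when the
        product actually changes %>
    <%= render partial: "product", locals: { product: product } %>
  <% end %>
<% end %>
```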
Page Caching and Action Caching
Page caching was a default option in previous versions of Rails, but it was removed in Rails 4. Now, page caching can be implemented with the actionpack-page_caching gem. Page caching in Rails is an effective type of caching that’s handled entirely by the web server, without going through the Rails stack: the web server returns cached static content without ever passing the request to the Rails application. In practice, this means a web page is transmitted nearly instantaneously, making page caching one of the quickest ways to serve your content.
But remember that page caching isn’t suitable for applications with frequently updated content like news feeds, because cached pages would keep serving content users have already seen.
Also, page caching isn’t available on web pages with actions for authentication or error message generation. You can implement action caching, however, which is similar to page caching. The difference is that action caching hits the Rails stack, so it runs so-called “before actions” before the cache is served. You can use the actionpack-action_caching gem to enable action caching.
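As a sketch of how the two gems are wired up in a Rails app (the controller, model, and `authenticate_user!` before action are all hypothetical names):

```ruby
# Gemfile (both gems restore the pre-Rails 4 APIs):
#   gem "actionpack-page_caching"
#   gem "actionpack-action_caching"

class ProductsController < ApplicationController
  # Writes the rendered page to a static file the web server can serve
  # directly, bypassing the Rails stack entirely. Only suitable for pages
  # that need no authentication and change rarely.
  caches_page :index

  # Goes through the Rails stack first, so before actions (such as
  # authentication) still run before the cached response is served.
  before_action :authenticate_user!, only: :show
  caches_action :show, expires_in: 1.hour

  def index
    @products = Product.all
  end

  def show
    @product = Product.find(params[:id])
  end
end
```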
Benefits of Caching
Caching improves your website’s speed. First, it reduces database load time since results that were found once may be used repeatedly. Second, caching reduces application server load time because a once-cached web page can be transmitted to users multiple times. Finally, caching increases the speed at which the server can process user requests since you can just return a cached query result to the user immediately without having to query the database and render data.
4. Web Hosting Solutions
Web hosting companies own servers ‒ computers on which your web application, databases, and software are stored and run. Additionally, web hosts provide services such as backups and server management. Your website speed directly depends on the web hosting service you rely on, and web hosts have different capacities and scalability options. The most common types of web hosting include shared web hosting, virtual private servers (VPS), cloud hosting, and dedicated hosting.
Although shared web hosting is the cheapest and most popular solution, we won’t consider it in this article because it doesn’t provide enough capabilities to ensure a stable website and consistent speed. Instead, we’ll focus on VPS, cloud, and dedicated hosting and explain why you might choose one over the others.
Virtual Private Servers
A virtual private server is the golden mean between an extremely cheap shared web hosting package and expensive dedicated hosting. VPS providers offer a personal virtual server for each client, which is more configurable and scalable than shared hosting since your configurations don’t influence other clients (as they do with shared hosting).
Also, VPS prices are affordable, though they may go up as your website scales and requires additional services. A VPS is an optimal solution for websites with average traffic. Thanks to their scalability, virtual private servers are also great for ecommerce sites that expect traffic spikes during certain periods. Amazon Web Services (AWS) and DigitalOcean are examples of popular cloud computing services that offer hosting, storage, computing, and other solutions for online businesses.
Cloud Hosting
You might question whether there’s any difference between a VPS and cloud hosting, as both are based on cloud infrastructure. They are, however, different: with cloud hosting, computing resources are spread across multiple servers. This makes cloud hosting highly scalable ‒ you can increase or decrease the computing resources for your application depending on the load it faces. Many cloud hosting providers offer autoscaling, which means the platform automatically allocates more resources to your application in case of traffic spikes. Also, cloud hosting is reliable ‒ computing resources are located across multiple physical servers, so if one fails, your application will keep running.
Pricing is another advantage of cloud hosting. Since computing resources are flexible, with cloud hosting you pay only for what you use. Providers offer both monthly and hourly rates and charge for any additional resources your application consumes. In other words, you don’t pay for resources your website doesn’t use. As a downside, you might end up with a hefty bill if your website experiences a prolonged traffic spike.
Cloud hosting is a great choice for small and medium-sized businesses that run applications with unpredictable traffic (which is common for ecommerce websites). Providers can offer cloud hosting plans tailored to specific types of applications. For example, if you need hosting for your ecommerce website running on the Magento platform, you can go for Magento web hosting that will help you better manage your application and enhance its performance.
Dedicated Hosting
Dedicated hosting provides you with a dedicated server ‒ a physical server that belongs only to you. This is an expensive solution. First, you pay to rent a server (as a rule, prices start at $150 per month). Second, you need a system administrator to maintain and manage your server. However, with dedicated hosting you get resources and power that are all yours. Also, you can customize basically everything, from your operating system to the type of memory. Dedicated hosting is recommended for enterprise-grade websites and high-traffic websites where stability and speed are critical.
The Bottom Line
So what web hosting package should you choose? The answer is simple: consider your business needs.
Cloud hosting is an excellent choice for businesses that need a highly scalable, flexible, and reliable platform to host their applications on. At the same time, cloud hosting offers fewer management and configuration opportunities than a VPS or dedicated hosting.
A VPS is the optimal solution for small and medium-sized businesses with a decent number of website visitors. For example, popular web hosting provider Hostgator offers VPS plans that allow you to handle from 9,000 to 35,000 visitors per day, which makes a VPS a true competitor to dedicated hosting. In addition to providing a baseline of necessary computing resources (memory, storage, processing) to keep a full-fledged and stable website functioning, you can add as many additional resources as you want over time to scale your hosting as your business grows. This feature is useful during seasonal bursts of traffic when your website is hit by unexpected volumes of visitors.
VPS and dedicated hosting can each handle over a million visitors per month, which is why dedicated hosting is reasonable only in cases when you require greater flexibility to tweak your hosting environment and have the staff to maintain your own server.
5. Content Delivery Network (CDN)
A website’s page load speed depends on the location of its server: the closer the server is to the user, the faster the page loads. The main idea behind CDNs is to serve content from locations geographically close to end users, enabling them to download content much faster. By using a CDN service, you address the problems described below.
Latency
Points of presence (POPs) for CDNs are distributed around the globe to enable users to retrieve requested content faster. For example, if a visitor from Japan tries to retrieve content from a server based in the USA, the CDN will reduce latency by serving the data from a server based in Japan ‒ as close to the end user as possible.
Infrastructure Costs
The more popular your website, the more people will try to access it. To ensure fast page loads, you need servers in multiple locations. However, handling and operating servers, especially across various locations, is quite costly. Many business owners find that using a CDN provider is much cheaper and more convenient than building and maintaining their own network of servers.
Stability
One of the advantages of a CDN is improved stability. If one server goes down, user requests are automatically redirected to the closest available server, so users don’t even notice a delay.
Traffic Spikes
All businesses try to get as much traffic as possible. However, traffic spikes (sudden bursts of traffic) caused by events like Black Friday or simply successful marketing campaigns can lead to poor website speeds or even errors. A single server might go down under a traffic spike, but CDNs like Akamai, Amazon CloudFront, and KeyCDN mitigate this problem by distributing the load.
Web page load speed is important for attracting and retaining customers. In general, we distinguish between frontend and backend techniques for improving website speed. The backend ‒ the server side of your website ‒ stays invisible to end users, but it matters a whole lot when it comes to your website’s speed. Some of the techniques discussed above can be implemented even by non-technical people, while others require specialists with a deep technical background. Either way, efforts to improve your website’s speed are well worth it.