In very simple terms, Varnish is a program that both speeds up a website and reduces the load on the web server. The official Varnish website describes it like this: "Varnish Cache is a web application accelerator also known as a caching HTTP reverse proxy."
If you look at what a web server does in normal operation, you will see that it takes HTTP requests and returns HTTP responses. Ideally, when a web server receives a request, it responds immediately without doing any time-consuming work. In reality, however, the server usually has to spend a significant amount of time processing the request before it can send the response to the client. Next, we explain how a typical web server responds to requests, and then show how Varnish improves this situation.
History of Varnish
Varnish originally grew out of the needs of a Norwegian newspaper (Verdens Gang), after which Poul-Henning Kamp, best known as a FreeBSD kernel developer, began developing it. Today's Varnish looks very different from the version that was first introduced: its architecture and the way it delivers its service have changed considerably over the years.
This caching service was first released in 2006 as version 1, followed later by versions 2 and 3, and in 2016 by version 5.
Notably, Varnish has been distributed under the BSD license from the very beginning and is free for everyone; you can use it simply by installing it on your server.
How does Varnish Cache work?
The main configuration tool for Varnish is the Varnish Configuration Language (VCL), a domain-specific language (DSL). VCL is used to write routines that are called at the various stages of handling each request. Because most of the configuration lives in VCL code, Varnish is more customizable and flexible than most other HTTP accelerators.
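As a rough illustration of what VCL looks like, here is a minimal sketch; the backend address, port, and the /admin path below are assumptions chosen only for the example, not part of any standard setup:

```vcl
vcl 4.0;

# The web server Varnish fetches content from (the "backend").
# Host and port are placeholder assumptions; adjust them to your setup.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# vcl_recv is called when a request arrives, before the cache is consulted.
sub vcl_recv {
    # Example policy: send requests for a hypothetical /admin area
    # straight to the backend, bypassing the cache.
    if (req.url ~ "^/admin") {
        return (pass);
    }
}
```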
Various runtime parameters keep things in check, such as the maximum and minimum number of worker threads, various timeouts, and so on.
A command-line management interface (accessed, for example, with the varnishadm tool) allows these parameters to be changed at run time, and new VCL scripts can be compiled, loaded, and activated without restarting the accelerator. To keep the number of system calls in the fast path to a minimum, log data is written to shared memory.
The task of monitoring, filtering, formatting, and writing that log data to disk is delegated to separate programs, such as varnishlog and varnishncsa.
Although every server has its own requirements, a typical web server performs a long series of actions to answer each request it receives. The process usually starts with spawning a new process or thread to handle the request. It may then have to read script files from disk and invoke an interpreter to parse those files and compile them to bytecode.
The web server then executes this bytecode, which can generate yet more work, such as running heavy SQL queries and fetching additional files from disk. Now imagine this process repeated for hundreds or thousands of requests, and you can better understand how a server suddenly becomes overloaded as it struggles to answer them all.
Worst of all, many requests are simply repeats of earlier ones, yet the server may have no way to reuse its previous response. It therefore has to repeat the same laborious process from scratch for every single request.
Everything described so far is exactly what Varnish can change. For example, we can have incoming requests received by Varnish instead of the web server. Varnish looks at what was requested and forwards the request to the web server (which Varnish calls a backend server). The backend does its usual work and returns the result to Varnish, and Varnish finally delivers that result to the user who originally asked for it.
Of course, this is not all Varnish does; so far it has not helped the process much. What Varnish adds is the ability to store the backend server's responses and keep them for future use. It can then answer subsequent requests straight from its cache, without putting unnecessary load on the web server.
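In VCL terms, the decision about whether and for how long a backend response is kept is made in the vcl_backend_response routine. The fragment below is a minimal sketch to be placed in your VCL file; the 120-second fallback lifetime is an assumed value chosen only for illustration:

```vcl
# Called after Varnish receives a response from the backend and must
# decide whether (and for how long) to keep it in the cache.
sub vcl_backend_response {
    # If the backend gave no caching information, keep the object for
    # two minutes (an assumed value, not a recommendation).
    if (beresp.ttl <= 0s) {
        set beresp.ttl = 120s;
    }
}
```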
✅ If we think about this contribution for a moment, the conclusion is clear: Varnish reduces the backlog of requests, improves response time, and ultimately lets the server answer more requests per second.
What makes Varnish so incredibly fast is that it does not store its cached responses on disk, but keeps them in memory. This, along with other optimizations, lets Varnish process requests at a significantly faster rate. That said, since memory is usually far more limited than disk, you should estimate how much space your cache needs and take steps to ensure that requests that would waste this valuable space are never stored there.
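The size of that in-memory cache is fixed when Varnish is started (for example with the malloc storage option, `-s malloc,256m`), so keeping wasteful objects out of it matters. Below is a hedged sketch of one way to do this in VCL, assuming that responses carrying cookies or marked private should not occupy cache memory; the header checks are a generic illustration, not a rule:

```vcl
sub vcl_backend_response {
    # Responses meant for a single user should not take up cache memory.
    if (beresp.http.Set-Cookie || beresp.http.Cache-Control ~ "private") {
        # Deliver the object to the client but do not keep it cached.
        set beresp.uncacheable = true;
        return (deliver);
    }
}
```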
For load balancing, Varnish offers both a round-robin director and a random director, which spread the load evenly across the backend servers. In addition, it can probe the backend servers to check their health.
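Here is a sketch of how a round-robin director with health probes can be configured in VCL, assuming two backend web servers; the addresses, ports, and probe values are illustrative assumptions:

```vcl
vcl 4.0;

# The directors are shipped as a bundled Varnish module (VMOD).
import directors;

# Health probe: fetch "/" every 5 seconds; a backend counts as healthy
# when at least 3 of the last 5 probes succeeded.
probe healthcheck {
    .url = "/";
    .interval = 5s;
    .timeout = 2s;
    .window = 5;
    .threshold = 3;
}

backend web1 { .host = "192.0.2.10"; .port = "8080"; .probe = healthcheck; }
backend web2 { .host = "192.0.2.11"; .port = "8080"; .probe = healthcheck; }

sub vcl_init {
    # Group both backends under a round-robin director.
    new pool = directors.round_robin();
    pool.add_backend(web1);
    pool.add_backend(web2);
}

sub vcl_recv {
    # Each request is handed to the next healthy backend in the pool.
    set req.backend_hint = pool.backend();
}
```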
What else does Varnish support?
- Support for extensions through Varnish modules, called VMODs.
- Support for Edge Side Includes (ESI).
- Gzip compression and decompression support.
- Support for client-IP-based, hash, random, and DNS directors.
- HTTP streaming for both pass and fetch.
- Experimental support for persistent storage, without LRU eviction.
- Support for saint mode and grace mode (see the sketch after this list).
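Grace mode, the last item above, lets Varnish keep serving a stale copy of an object while a fresh one is being fetched or while the backend is unavailable. A minimal sketch, assuming a one-hour grace window chosen only for illustration:

```vcl
sub vcl_backend_response {
    # After the object's TTL expires, Varnish may still serve the stale
    # copy for up to this long while it fetches a fresh one.
    set beresp.grace = 1h;
}
```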
Who should use Varnish?
Anyone can use Varnish; however, this accelerator is most useful for businesses that need speed and scalability and want to reduce the pressure on their server infrastructure.
Consider, for example, large online stores, where a delay of even a few hundredths of a second can cost them a customer. Businesses like these should definitely use Varnish.
Important points when setting up Varnish
The benefits of Varnish and how it works have been explained in detail in this Varnish Cache introduction; there are, however, a few important points you should know before setting up and using the service.
Note 1: The Varnish service starts serving with its default settings right after installation, but those defaults are not suitable for news websites or sites that publish content minute by minute: users will keep seeing your old content, and new content will not appear until the cache is purged.
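One common remedy is to allow the cache to be purged explicitly whenever content changes. The fragment below is a hedged sketch of purge support in VCL, assuming that purges should only be accepted from the local machine; the address in the ACL is an assumption:

```vcl
# Only trusted addresses may purge; 127.0.0.1 is an assumed example.
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    # An HTTP PURGE request removes the matching object from the cache,
    # so freshly published content becomes visible immediately.
    if (req.method == "PURGE") {
        if (client.ip !~ purgers) {
            return (synth(405, "Purging not allowed"));
        }
        return (purge);
    }
}
```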
Note 2: Varnish Cache is very flexible and can be tuned to the point where your site runs extremely fast.