How to deal with a LOT of traffic - Eporner Case Study
Here at Eporner we serve about 500M pageviews each month. During peak hours our infrastructure handles over 8,500 page requests per second, and that does NOT include static content like graphics and videos. Moreover, we are always prepared to handle at least 3x the peak traffic. How do we achieve this?
Infrastructure
Other websites use ready-to-go cloud platforms such as Azure, Google Cloud or Amazon AWS. We have decided to manage our infrastructure on our own: we have very complex needs and cannot rely on managed solutions. With this in mind, our infrastructure architects designed and implemented everything from scratch. We have dozens of enterprise-level servers and pieces of network equipment, from which we have built an environment fully adapted to our requirements. Let's start our journey from the side where your traffic enters our network.
Load balancers
Users expect our website to be available 24/7/365 to deliver the best quality videos. To meet that requirement we start with multiple load balancers. These boxes are the only endpoints of our infrastructure exposed to the public; every user visiting our website connects to one of them. A load balancer takes care of the secure connection with your computer over HTTPS/TLS, reads your request and directs it to the right internal server for further processing. We use nginx for our load balancers because of its high performance (a single box can handle over 500,000 requests per second), its maturity (nginx powers over 400 million websites) and its open source nature, which allows us to write custom modules to extend its capabilities. Our load balancers support HTTP/2 and have a specially optimized TCP stack for even better performance.
We run multiple load balancers to provide High Availability. Thanks to this, if one of them fails, the others take over its traffic to ensure uninterrupted service.
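The balancing and failover behavior described above can be sketched in a few lines. This is a minimal, hypothetical round-robin picker (not our actual nginx configuration, and the backend names are made up): healthy backends are rotated through, and a node that fails its health check is simply skipped so the others absorb its traffic.

```python
from itertools import cycle

class LoadBalancer:
    """Minimal round-robin balancer with failover (illustrative sketch only)."""

    def __init__(self, backends):
        self.backends = backends          # hypothetical names, e.g. ["app1", "app2", "app3"]
        self.healthy = set(backends)      # backends currently passing health checks
        self._ring = cycle(backends)

    def mark_down(self, backend):
        """Health check failed: stop sending traffic to this node."""
        self.healthy.discard(backend)

    def mark_up(self, backend):
        """Node recovered: put it back into rotation."""
        self.healthy.add(backend)

    def pick(self):
        # Walk the ring, skipping unhealthy nodes so the remaining
        # backends take over the failed node's share of traffic.
        for _ in range(len(self.backends)):
            candidate = next(self._ring)
            if candidate in self.healthy:
                return candidate
        raise RuntimeError("no healthy backends")

lb = LoadBalancer(["app1", "app2", "app3"])
lb.mark_down("app2")                       # simulate one box failing
picks = [lb.pick() for _ in range(4)]      # app2 never receives traffic
```

In production this logic lives inside nginx's upstream module rather than application code; the sketch only shows the idea.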
So your traffic has hit one of our load balancers, which has directed it onward.
Processing servers
Traffic from the load balancers is directed to one of our processing servers. These boxes generate the page you have requested. Although the pages we send to end users are HTML and JavaScript, we first need to generate them using data from our databases. We use PHP for page generation. PHP is a popular scripting language especially suited to web development: it is extremely fast, scales very well, and is used by some of the most popular sites in the world, such as Facebook, Wikipedia and WordPress.
To generate your page, we first need data about the available videos and photos. We store hundreds of gigabytes of such data across multiple databases. For basic information we use a cluster built on MariaDB with multiple replication nodes, High Availability and fault tolerance. For larger datasets we use a NoSQL database, MongoDB, also set up with multiple replication nodes and fault tolerance.
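The fault-tolerance idea behind those replication nodes can be illustrated with a tiny failover read. This is a hypothetical sketch, not the actual MariaDB or MongoDB driver behavior: each node is tried in turn, and a dead node is silently skipped in favor of a replica.

```python
def read_with_failover(query, nodes):
    """Try each replication node in turn until one answers.
    Illustrative only; real drivers handle this internally."""
    last_error = None
    for node in nodes:
        try:
            return node(query)
        except ConnectionError as exc:
            last_error = exc   # node is down, fall through to the next replica
    raise last_error

# Stand-ins for database nodes: the primary is down, a replica answers.
def primary(query):
    raise ConnectionError("primary unreachable")

def replica(query):
    return {"title": "example video", "views": 12345}

row = read_with_failover("SELECT ...", [primary, replica])
```

The page generation code never notices the failed node; it just gets its row a few milliseconds later.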
But that's not all. To squeeze out even more performance we use multiple levels of caching: Memcached and Redis daemons for small objects, and MongoDB for whole fragments of webpages. Every part of our processing infrastructure is duplicated multiple times to protect against any kind of failure.
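The small-object caching layer follows the classic cache-aside pattern. Here is a minimal sketch of it, with a plain dictionary standing in for a Memcached/Redis daemon and a made-up `get_video_meta` helper (the TTL value is also just an example):

```python
import time

cache = {}     # stand-in for a Memcached/Redis daemon
TTL = 60       # seconds; real TTLs are tuned per object type

def get_video_meta(video_id, fetch_from_db):
    """Cache-aside lookup: serve from cache while fresh,
    otherwise hit the database and remember the result."""
    entry = cache.get(video_id)
    if entry is not None and time.time() - entry["at"] < TTL:
        return entry["value"]               # cache hit: no database work at all
    value = fetch_from_db(video_id)         # cache miss: query the database
    cache[video_id] = {"value": value, "at": time.time()}
    return value

db_calls = []
def db_fetch(video_id):
    db_calls.append(video_id)               # count real database hits
    return {"id": video_id, "title": "demo"}

get_video_meta(42, db_fetch)
get_video_meta(42, db_fetch)                # second call served from cache
```

With hot objects served from memory, the databases only see the first request for any given piece of data within the TTL window.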
OK, so we have generated the HTML of your page and are ready to send it to you. Our website is rich in videos, photos and multiple scripts (video previews and so on), which results in quite large page code. To speed up loading and reduce the amount of data you have to download, we compress every webpage before sending it to your browser. Most websites use gzip compression, but that is not enough for us. We always strive for perfection, so we use the newer Brotli compression to push the amount of transferred data and the load time even lower.
Once the page code reaches your browser, rendering starts. To render the page, your browser also needs to download all the required resources such as images and videos. This is where our content servers come in.
Content servers
Our content servers are the boxes that physically hold the image and video material. Once again: here at Eporner we are perfectionists, so we use only NVMe- and SSD-based content servers to achieve the highest performance. They are distributed around the world to keep them close to you and as fast as possible. Each piece of data is replicated in at least three different locations to guarantee fault tolerance and high speed; in case of any failure our system immediately redirects traffic to another node to keep the website online. Our servers also support dynamically serving WebP images instead of JPEG for even faster loading and a smaller page size.
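The dynamic WebP serving mentioned above is plain content negotiation on the request's `Accept` header. A minimal sketch, with hypothetical file names (our real servers do this inside the web server, not in application code):

```python
def pick_image_format(accept_header):
    """Serve WebP when the browser advertises support for it,
    otherwise fall back to JPEG. Illustrative sketch only."""
    if "image/webp" in accept_header:
        return "thumb.webp"   # hypothetical file name
    return "thumb.jpg"

# A modern browser advertises WebP support in its Accept header;
# an old client that lists only JPEG gets the JPEG variant.
modern = pick_image_format("image/avif,image/webp,image/apng,*/*;q=0.8")
legacy = pick_image_format("image/jpeg,image/png,*/*;q=0.5")
```

Because the same URL can return either format, responses like this are served with `Vary: Accept` so caches keep the two variants apart.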
For best performance, all our content servers have a fully optimized TCP stack, use HTTP/2 for the fastest delivery and support TLS 1.3 with 0-RTT for both performance and security. They are connected with 10Gbps uplinks, and we use multiple colocation datacenters and network providers with a total capacity of over 35Tb/s. This way we are always ready to handle your traffic.
Streaming servers
Streaming servers are a special case of content servers. They have all the advantages mentioned above, plus even greater storage (up to 96TB of SSDs each) and special software. In 2015 we were the first porn tube website to support Dynamic Adaptive Streaming over HTTP (MPEG-DASH). This lets us offer the best video quality to every user, whether they are on a cell phone, a slow connection or super fast fiber. DASH allows the stream to adapt dynamically to your connection speed (and, unlike HLS, to adapt audio and video independently) so you always get the best possible quality.
For older devices we also support HTTP Live Streaming (HLS), which offers adaptive streaming too but is less advanced. At the time of writing we are probably the only website in the porn industry to support both DASH and HLS streaming out of the box.
Adaptive streaming lets us serve the same video in 4K / 2160p to users with broadband connections and to cell phone users on 3G, all without buffering.
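The core of adaptive streaming is the player's rendition choice between segments. Here is a sketch of that decision with a hypothetical bitrate ladder (a real DASH manifest lists the actual encoded representations, and real players use more sophisticated throughput estimators):

```python
# Hypothetical bitrate ladder (kbps) from lowest to highest rendition.
LADDER = [
    ("240p", 400),
    ("480p", 1200),
    ("720p", 2800),
    ("1080p", 5000),
    ("2160p", 16000),
]

def choose_rendition(measured_kbps, safety=0.8):
    """Pick the highest rendition whose bitrate fits within a safety
    margin of the measured throughput, as a DASH/HLS player does
    between segments. Illustrative sketch only."""
    budget = measured_kbps * safety   # leave headroom so playback doesn't stall
    best = LADDER[0][0]               # never drop below the lowest rung
    for name, kbps in LADDER:
        if kbps <= budget:
            best = name
    return best

slow_3g = choose_rendition(3000)      # a 3G-class connection
broadband = choose_rendition(50000)   # fast fiber
```

Because the choice is re-evaluated every few seconds of video, the same file serves both users: the 3G viewer gets a mid-tier rendition that never rebuffers, while the fiber viewer climbs to 2160p.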
Eporner traffic in numbers
- Over 500M pageviews monthly.
- Over 8,500 requests per second to our webpage servers during peak hours.
- Over 150Gbps of traffic on a normal day, with the ability to handle at least twice that amount immediately.
- Over 3,000,000 videos.
- Nearly 1,000,000 photos.
- Nearly 1 petabyte (1,000,000 GB) of storage.
- All of our infrastructure supports IPv6, HTTP/2, TLS 1.3 with 0-RTT and Brotli compression.
Summary
I hope you have enjoyed the journey through our infrastructure. Every day we do our best to guarantee you an extraordinary porn experience. See you in the next part, in which we'll discuss more specific aspects of our performance.
Leon Fischer - Eporner IT manager and infrastructure architect