The Azure Marketplace is a software repository for pre-built and configured Azure resources from independent software vendors (ISVs). You will find open source and enterprise applications that have been certified and optimized to run on Azure.
NGINX, Inc. provides the latest release of NGINX Plus in the Azure Marketplace as a virtual machine (VM) image. NGINX OSS is not available from NGINX, Inc., but several options are available from other ISVs in the Azure Marketplace.
Searching for “NGINX” in the Azure Marketplace produces several results from multiple publishers.
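The same search can be performed from the command line with the Azure CLI. This is a sketch, not an authoritative listing: the publisher ID `nginxinc` is the one NGINX, Inc. uses for its NGINX Plus images, while other ISVs publish under their own IDs, so results will vary over time.

```shell
# List NGINX Plus VM images published by NGINX, Inc. (publisher ID
# "nginxinc" is an assumption; verify it in your Marketplace region).
az vm image list --publisher nginxinc --all --output table

# Broaden the search to any Marketplace offer whose name contains "nginx":
az vm image list --offer nginx --all --output table
```

Both commands require an authenticated Azure CLI session (`az login`).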
NGINX Open Source Software (OSS) is free, while NGINX Plus is a commercial product licensed by NGINX, Inc. that offers advanced features and enterprise-level support.
NGINX Plus combines the functionality of a high-performance web server, a powerful front-end load balancer, and a highly scalable accelerating cache to create an end-to-end platform for your web applications. NGINX Plus is built on top of NGINX open source.
For organizations currently using NGINX open source, NGINX Plus eliminates the complexity of managing a “do-it-yourself” chain of proxies, load balancers, and caching servers in a mission-critical application environment.
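To make the “web server, load balancer, and cache in one” claim concrete, the following is a minimal configuration sketch. The server addresses, cache path, and zone name are hypothetical, and these particular directives are available in NGINX open source as well as NGINX Plus:

```nginx
# Hypothetical example: one NGINX instance acting as both load balancer
# and caching proxy in front of two application servers.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=app_cache:10m
                 max_size=1g inactive=60m;

upstream app_servers {
    least_conn;                  # route each request to the least-busy server
    server 192.168.1.10:8080;
    server 192.168.1.11:8080;
}

server {
    listen 80;

    location / {
        proxy_cache app_cache;
        proxy_cache_valid 200 10m;   # cache successful responses for 10 minutes
        proxy_pass http://app_servers;
    }
}
```

A single instance like this replaces the separate load-balancer and cache tiers that a “do-it-yourself” chain would otherwise require.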
Azure provides several options for managed load balancing services:
- Azure Load Balancer
- Azure Application Gateway
- Azure Traffic Manager
Each of these services is reviewed below to help you understand when to use it effectively.
The Open System Interconnection (OSI) model defines a networking framework to implement protocols in seven layers:
- Layer 7: The application layer
- Layer 6: The presentation layer
- Layer 5: The session layer
- Layer 4: The transport layer
- Layer 3: The network layer
- Layer 2: The data-link layer
- Layer 1: The physical layer
The OSI model doesn’t itself perform any function in the networking process; it is a conceptual framework for understanding the complex interactions that occur between these layers.
Load balancers have evolved considerably since they were introduced in the 1990s as hardware-based servers or appliances. Cloud load balancing, also referred to as Load Balancing as a Service (LBaaS), is an updated alternative to hardware load balancers. Regardless of the implementation of a load balancer, scalability is still the primary goal of load balancing, even though modern load balancers can do so much more.
Optimal load distribution reduces site inaccessibility caused by the failure of a single server while assuring consistent performance for all users. Different routing techniques and algorithms ensure optimal performance in varying load-balancing scenarios.
Modern websites must support concurrent connections from clients requesting text, images, video, or application data, all in a fast and reliable manner, while scaling from hundreds of users to millions of users during peak times. Load balancers are a critical part of this scalability.
- Problems Load Balancers Solve
- The Solutions Load Balancers Provide
- The OSI Model and Load Balancing
Problems Load Balancers Solve
In cloud computing, load balancers solve problems in three areas:
- Cloud Bursting
- Local Load Balancing
- Global Load Balancing
Cloud bursting is a configuration between a private cloud (that is, an on-premises compute environment) and a public cloud. A load balancer redirects overflow traffic from a private cloud that has reached 100% of its resource capacity to a public cloud, avoiding degraded performance or an interruption of service.
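A simple cloud-bursting setup can be sketched with standard NGINX directives. In this hypothetical configuration (the addresses, hostname, and connection limits are illustrative), the on-premises servers take all traffic until each reaches its `max_conns` limit or becomes unavailable, at which point requests spill over to the public-cloud server marked `backup`:

```nginx
# Hypothetical cloud-bursting upstream: private-cloud servers serve all
# traffic until saturated or down; the "backup" server in the public
# cloud only receives requests when no primary server can accept them.
upstream app_backend {
    server 10.0.1.10:80 max_conns=1000;   # private cloud (on-premises)
    server 10.0.1.11:80 max_conns=1000;   # private cloud (on-premises)
    server burst.example.cloudapp.azure.com:80 backup;  # public-cloud overflow
}

server {
    listen 80;
    location / {
        proxy_pass http://app_backend;
    }
}
```

The `backup` parameter keeps the public cloud idle (and inexpensive) under normal load, while `max_conns` defines the capacity threshold at which bursting begins.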