Web Application Vulnerabilities and Control

  • January 20, 2021
  • News

Web application security is a hot topic because once an intruder gains access to an application, it becomes a staging ground for their malware.

To detect and overcome this challenge, researchers have developed client-side and server-side mechanisms to enforce web application security.

According to the Communications Usage Trend Survey, many websites host services online. To communicate securely with clients, most websites use SSL encryption, network firewalls, and other security devices.

However, these measures cannot prevent application-level attacks: more than 75% of attacks are targeted at web applications.

In this blog, we walk through the application vulnerabilities listed by OWASP (the Open Web Application Security Project), then highlight the tools used by attackers, and finish with a conclusion.

Web Application Vulnerabilities


Injection Flaws

Injection happens when an interpreter receives untrusted data as part of a query or command. Examples are SQL, LDAP, and OS injection flaws. The simplest protection against injection flaws is to avoid calling external interpreters wherever possible.

For many shell commands and some system calls, there are language-specific libraries that perform the same functions. Using such libraries avoids many shell-command problems because they do not involve the operating system's shell interpreter.

For calls that must still be made, such as those to backend databases, the supplied data must be carefully validated to ensure it contains no malicious content. Requests can also be structured so that all supplied parameters are treated as data rather than executable content.

Stored procedures provide significant protection by ensuring that supplied input is treated as data. They effectively reduce, but do not eliminate, the risk of external calls. Such input must always be validated to ensure it meets the application's expectations.
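As a minimal sketch of treating parameters strictly as data, here is a parameterized query using Python's standard `sqlite3` module (the table and values are invented for illustration):

```python
import sqlite3

# In-memory demo database with a hypothetical schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(conn, name):
    # The ? placeholder guarantees the supplied value is bound as data,
    # never interpreted as SQL, so injection payloads are inert.
    cur = conn.execute("SELECT name, role FROM users WHERE name = ?", (name,))
    return cur.fetchone()

print(find_user(conn, "alice"))           # ('alice', 'admin')
print(find_user(conn, "' OR '1'='1"))     # None - matched as a literal string
```

Contrast this with string concatenation (`"... WHERE name = '" + name + "'"`), where the same payload would rewrite the query.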

Ensuring that a web application runs with only the privileges it needs to perform its functions also helps against command injection attacks. Therefore, avoid accessing a database or web server as root, because an attacker could abuse any administrative privileges granted to the application.

All call outputs and error codes should be checked to confirm that the expected processing occurred, and error handling should be set up for possible blockages, timeouts, and failures.

Common Types of Cyber Attacks to Protect Against

Cross-Site Scripting (XSS)

This attack occurs when an application sends untrusted data to a client browser without validating it. XSS allows an intruder to execute scripts in the victim's browser, hijack user sessions, and redirect users to malicious websites.

To overcome XSS, your application should perform rigorous checks against defined specifications. It should validate all parameters, and we recommend a positive security policy that defines only what is allowed, rather than a negative or signature-based policy, which may be incomplete.
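A small sketch of both halves of that advice, using only the Python standard library: a positive (allowlist) validation rule, plus escaping on output. The username character set is an assumed specification for this example:

```python
import html
import re

# Positive policy: a username may contain only letters, digits, and
# underscores, 1-32 characters (an assumed spec for this sketch).
USERNAME_RE = re.compile(r"[A-Za-z0-9_]{1,32}")

def validate_username(value):
    # Reject anything outside the allowed set instead of trying to
    # enumerate every dangerous pattern (negative policy).
    if not USERNAME_RE.fullmatch(value):
        raise ValueError("username outside allowed character set")
    return value

def render_greeting(name):
    # Escape on output so the browser treats the value as text, not markup.
    return "<p>Hello, {}!</p>".format(html.escape(name))

print(render_greeting("<script>alert(1)</script>"))
# <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;!</p>
```

The escaped output is rendered by the browser as harmless visible text rather than an executable script tag.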

OWASP has produced reusable components in multiple languages to help developers prevent parameter-tampering attacks, including XSS.

OWASP has also released CodeSeeker, a firewall that works at the application level, and WebGoat, a training program that gives free lessons on data encoding and cross-site scripting.

Broken Authentication and Session Management

Improper implementation of authentication and session management allows attackers to compromise user identities: an attacker can steal session tokens, keys, and passwords.

The most effective way to reduce this risk is the careful use of off-the-shelf or custom authentication and session management mechanisms. A good first step is defining and documenting the site's policy for managing users securely.

The key to robust, secure sessions and accounts is ensuring that the implementation consistently and correctly enforces that policy.

Some common criteria include:

Password change requests
Password strength checks 
Session ID protection
Browser caching
Backend authentication
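For the session ID protection item above, the core requirement is that IDs be unguessable. A minimal sketch using Python's `secrets` module, which draws from the operating system's CSPRNG:

```python
import secrets

def new_session_id():
    # 32 random bytes, hex-encoded: 256 bits of entropy, making
    # brute-force guessing of a live session ID infeasible.
    return secrets.token_hex(32)

sid = new_session_id()
print(sid)        # e.g. 'f3a1...'; 64 hex characters, unique per call
```

The important design choice is using a cryptographic source (`secrets`) rather than `random`, whose output is predictable and unsuitable for session tokens.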


Insecure Direct Object References

This occurs when a developer unknowingly exposes a reference to an internal implementation object, such as a file or directory. If no protection exists, attackers can manipulate these references to access critical data.

The best prevention strategy is to avoid exposing direct object references to users, using indirect references that are easy to validate instead. A user must be authorized before any direct object reference is used, and it is important to have an established way of referencing application objects:

Limit references to private objects, such as filenames or primary keys, whenever possible
Use the "accept known good" approach to extensively validate references to private objects
Verify all object references
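The indirect-reference idea above can be sketched as a small map from opaque tokens to internal names, so the real filename or key never reaches the client (the path used here is hypothetical):

```python
import secrets

class IndirectMap:
    """Maps opaque, random tokens to internal object names."""

    def __init__(self):
        self._by_token = {}

    def expose(self, internal_name):
        # Hand the client a random token instead of the real reference.
        token = secrets.token_urlsafe(16)
        self._by_token[token] = internal_name
        return token

    def resolve(self, token):
        # Unknown or tampered tokens fail closed instead of
        # touching the filesystem or database.
        if token not in self._by_token:
            raise PermissionError("unknown object reference")
        return self._by_token[token]

refs = IndirectMap()
token = refs.expose("/var/data/invoice-42.pdf")  # hypothetical internal path
print(refs.resolve(token))                       # /var/data/invoice-42.pdf
```

In a real application the map would be scoped per user or per session, so one user's token cannot resolve another user's object.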

Cross-Site Request Forgery (CSRF)

An attacker forces a victim's browser to send requests that the server believes are legitimate.

These forged HTTP requests are sent to the vulnerable web application carrying the victim's session cookie and other identification information.

Applications should not rely solely on credentials that browsers submit automatically; they should also require custom tokens that browsers do not remember, so an attacker cannot use them to initiate a CSRF attack.

All web applications should implement the following strategies:

Ensure the application is not vulnerable to XSS attacks
Reauthentication and transaction signing can be used in very sensitive data transactions to ensure that the request is genuine.
Avoid using GET requests for critical data or value transactions; sensitive data should be submitted via POST. Including a random token in each request makes every request unique, which prevents an attacker from forging it.
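The random-token strategy above can be sketched with the standard library: issue a per-session token, embed it in each form, and verify it in constant time on submission (the `session` dict stands in for a real session store):

```python
import hmac
import secrets

def issue_csrf_token(session):
    # Generated server-side and stored in the session; the attacker's
    # site cannot read it, so forged requests won't carry it.
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def verify_csrf_token(session, submitted):
    expected = session.get("csrf_token", "")
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(expected, submitted)

session = {}                      # stand-in for a real session store
token = issue_csrf_token(session)
print(verify_csrf_token(session, token))           # True
print(verify_csrf_token(session, "forged-value"))  # False
```

Web frameworks typically automate this pattern; the sketch shows only the underlying mechanism.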

Security Misconfiguration

A secure environment requires a properly configured application environment. Most settings ship with defaults, so they should be explicitly defined and maintained.

The first step is to harden the web server and application server. The same configuration should be applied to all application hosts, including development environments.

We suggest starting from established hardening guidance available from vendors or security organizations such as OWASP or CERT, and then customizing it to your specific needs.

The following topics should be included in the hardening guideline:

Security mechanisms configuration
Turning off unused services
Setting up accounts, roles, and permissions, including changing default passwords and disabling default accounts
Logs and alerts

Once the guideline is established, use it to configure and maintain the servers. An automation tool can speed up the configuration process, especially when a large number of servers is involved.

Several such tools already exist and can be configured to match your particular policies.

Keeping a server configuration secure requires vigilance. Someone should be assigned responsibility for keeping the server up to date. The maintenance process includes:

Ensuring the latest security patches are applied
Monitoring newly published security patches
Updating the security configuration guideline
Regularly scanning applications for vulnerabilities, both internally and externally
Regular security reviews of the server configuration
Regularly documenting security status reports for management


Insecure Cryptographic Storage

Failing to properly encrypt critical data, e.g., credit card numbers, using modern techniques puts that data at risk: attackers can read or modify weakly protected data and commit fraud.

Everything that requires encryption must be identified, and the encryption must be done properly. The following should be considered to ensure proper handling of cryptographic material:

Only approved cryptographic algorithms, such as AES, RSA, and SHA-256, should be used
Weak algorithms such as MD5 and SHA-1 should be avoided
Avoid transmitting private keys over insecure channels. 
Keys should be generated offline and stored with care
Make decryption of data stored on disks harder for attackers
Ensure that infrastructure credentials such as database details are properly encrypted and not easily decrypted by attackers
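One concrete case of the list above is storing passwords: rather than a fast, weak hash like MD5, use a salted, deliberately slow key-derivation function. A sketch with the standard library's PBKDF2 (the iteration count is an illustrative choice):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # A fresh random salt per password defeats precomputed rainbow tables;
    # 200,000 PBKDF2 iterations make brute-force guessing expensive.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def check_password(password, salt, stored_digest):
    _, candidate = hash_password(password, salt)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(check_password("correct horse battery staple", salt, digest))  # True
print(check_password("wrong guess", salt, digest))                   # False
```

Dedicated password-hashing schemes such as bcrypt, scrypt, or Argon2 are also common choices; PBKDF2 is used here because it ships with the standard library.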

Failing to Restrict URL Access

Even though applications validate access rights to a URL before rendering protected buttons or links, the same access control checks must be performed every time a protected page is requested. Otherwise, attackers can forge URLs to reach the hidden pages directly.

To protect against unrestricted URL access, create a matrix that maps application roles to functions. Applications should enforce access controls on every URL.

Checking authorization once and skipping it in subsequent steps is not sufficient. Considerations to prevent this attack include:

Ensuring that access control configuration is part of the application design and business processes
Protecting business functions and all URLs with an effective access control mechanism that verifies the user's role before any processing
Performing a penetration test before deployment to confirm the application is free of known vulnerabilities
Ensuring that library files are not placed in the web root folder
Never assuming that users are unaware of hidden URLs, and protecting administrative functions explicitly
Restricting access to file types your application should never serve. Allow only what the application is meant to serve, e.g. .html, .php, .pdf; this blocks unintended files, such as .xml, from being served directly
Keeping components that handle user-supplied data, such as XML processors, up to date
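The role-to-function matrix described above can be sketched as a simple prefix table checked on every request. The URLs and roles here are invented for illustration:

```python
# Hypothetical matrix mapping URL prefixes to the roles allowed there.
ACCESS_MATRIX = {
    "/admin/":   {"admin"},
    "/reports/": {"admin", "analyst"},
    "/":         {"admin", "analyst", "user"},
}

def is_authorized(role, url):
    # Match the most specific prefix first; anything unmatched is denied,
    # so new URLs are protected by default ("deny by default").
    for prefix in sorted(ACCESS_MATRIX, key=len, reverse=True):
        if url.startswith(prefix):
            return role in ACCESS_MATRIX[prefix]
    return False

print(is_authorized("admin", "/admin/users"))  # True
print(is_authorized("user", "/admin/users"))   # False - checked per request
print(is_authorized("user", "/home"))          # True
```

The key property is that the check runs on every request, not only when the link was first rendered.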

Insufficient Transport Layer Protection

Systems sometimes use weak algorithms or expired certificates, or fail to encrypt network traffic even when encryption is necessary. The following should be verified to ensure transport layer security:

All authentication traffic is protected using SSL
Support for strong algorithms only
Browser doesn’t transmit session cookies in clear because they all have their secure flags
The certificate of the server is legitimate and is well configured for that particular server. This includes- not expired, an authorized issuer issued it and matches all site domains.

Transport layer protection can affect a site's layout. Requiring SSL for the whole site is simpler, but some sites require SSL only for private pages to improve performance.

However, this can expose session IDs. The following are the minimum requirements:

SSL should be required for every sensitive page, and all non-SSL requests for those pages should be redirected to their SSL versions
All sensitive cookies should have the secure flag set
The SSL provider should support only strong algorithms
Ensure the certificate is valid, not expired or revoked, and matches all domains used by the site
Backend connections should also use SSL or another encryption technique
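Setting the secure flag on cookies, as required above, can be illustrated with the standard library's `http.cookies` module (the cookie value is a placeholder):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # placeholder value for the demo
cookie["session_id"]["secure"] = True    # only ever sent over HTTPS
cookie["session_id"]["httponly"] = True  # hidden from client-side scripts
cookie["session_id"]["path"] = "/"

# Produces the Set-Cookie header the server would emit.
header = cookie.output(header="Set-Cookie:")
print(header)
```

With `Secure` set, the browser refuses to attach the cookie to plain-HTTP requests, so the session ID never crosses the network unencrypted.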

Unvalidated Redirects and Forwards

Web applications can redirect and forward users to other web pages. Without proper validation, criminals can manipulate victims into visiting compromised sites.

The following are ways to discover whether an application has unvalidated redirects and forwards:

Reviewing all redirect and forwarding code to verify that parameters contain only allowed destinations or destination elements
Spidering the site to check for redirect responses. If the HTTP response code is 300-307, check whether the supplied parameters appear to contain a target URL, change that target, and see whether the site redirects to the new destination
If the code is unavailable, checking all parameters to see whether they look like part of a redirect or forward destination, and testing those that do

To stay on the safe side, implement the following:

Limit the use of redirects and forwards
Where they are used, avoid involving user-supplied parameters in calculating the destination
If destination parameters cannot be avoided, ensure the supplied value is valid and authorized for the user
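The destination-validation step above can be sketched as an allowlist check with `urllib.parse`; the allowed hostnames are invented for this example:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of external hosts this site may redirect to.
ALLOWED_HOSTS = {"example.com", "docs.example.com"}

def safe_redirect_target(url):
    parsed = urlparse(url)
    # Relative paths stay on-site, so they are always acceptable.
    if not parsed.netloc:
        return url
    # Absolute URLs must be HTTPS and match the allowlist exactly.
    if parsed.scheme == "https" and parsed.hostname in ALLOWED_HOSTS:
        return url
    raise ValueError("redirect target not allowed")

print(safe_redirect_target("/account/settings"))           # allowed: on-site
print(safe_redirect_target("https://docs.example.com/x"))  # allowed: listed
try:
    safe_redirect_target("https://evil.example.net/")
except ValueError:
    print("blocked off-site redirect")
```

Checking the parsed hostname, rather than doing substring matching on the raw URL, avoids bypasses like `https://example.com.evil.net/`.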

Web Application Attack Tools

Wget – downloads web pages for offline viewing
WebSleuth – scans sites for XSS bugs
Instant Source – lets you closely inspect a website's source
Window Bomb – checks incoming queries on websites
Burp – a web security testing tool for Java applications
cURL – scans URLs for format, Unicode characters, and structure
Black Widow – downloads all website files up to the root directory
Acunetix – one of the best and fastest web vulnerability scanners


Conclusion

In the OSI reference model, messages travel through seven layers to reach the application layer, where the HTTP protocol transports content such as SOAP, XML, or HTML.

Most attackers know how to make HTTP requests look simple and benign while carrying very harmful data. Web application attacks can alter website content, execute system commands remotely, and even grant the attacker full access to a database.

In this article, we highlighted the most important web application security vulnerabilities and pointed out their most fundamental solutions.

We also presented the most commonly used web attack tools. By identifying vulnerabilities and applying their solutions, servers can offer the best protection against malicious activity.

NaenMedia is a renowned website design company in India, offering Emerging Technology Services to clients across the globe. We offer all kinds of web design and web development services using the latest technologies. We are also a leading digital marketing company providing SEO, SMM, SEM, inbound marketing services, etc. at affordable prices. For further information, please contact us.
