
What are good and acceptable error rates?

In this Learning Center article, we'll explain how to define good and acceptable error rates.

A good error rate is the error percentage you aim to achieve for your application, whereas an acceptable error rate is the highest percentage of errors you're willing to tolerate.

What is regarded as a good vs. acceptable error rate will vary depending on several factors, such as the context of your application, its use case, and the expectations of your users.

Generally speaking, lower error rates are desirable as they indicate higher reliability and customer satisfaction.

What is a good error rate?

To determine what constitutes a good error rate, you need a thorough understanding of your application's architecture and the consequences of errors for your end users. For example, a customer being unable to add an item to their shopping cart is less critical than being unable to access their bank account, so a banking application will likely have a lower good error rate than an e-commerce application.

Below is an example of a chart tracking an incrementing error rate in AppSignal:

Screenshot chart of error rate slowly rising

Weigh these consequences against your technical constraints, such as the limitations of your codebase and the practical and financial costs of maintaining a low error rate.

What is an acceptable error rate?

An acceptable error rate is the highest percentage of errors you're willing to tolerate for your application. This rate varies per application and should take into account the context and consequences of errors. You can apply one rate to your entire application or set different rates for different application components.

An acceptable error rate differs from a good error rate: the good rate is the target you aim for, while the acceptable rate is the ceiling you can tolerate before taking action.

As with a good error rate, defining an acceptable error rate requires a solid grasp of your application's architecture and the consequences of potential errors. Don't forget to factor in technical constraints here as well.

In practice

An e-commerce application may have an acceptable error rate that's as high as 10%, but a good error rate of 1%, whereas a banking application may have an acceptable error rate of 1% and a good error rate of 0.1%.
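
To make these numbers concrete, here is a minimal sketch (in Python, using the e-commerce thresholds from the example above; the values are illustrative, not recommendations) of classifying a measured error rate:

```python
# Illustrative thresholds from the e-commerce example above:
# a "good" rate of 1% and an "acceptable" rate of 10%.
GOOD_THRESHOLD = 1.0         # percent
ACCEPTABLE_THRESHOLD = 10.0  # percent

def error_rate(error_count: int, request_count: int) -> float:
    """Error rate as a percentage of total requests."""
    if request_count == 0:
        return 0.0
    return error_count / request_count * 100

def classify(rate: float) -> str:
    """Classify a measured error rate against the thresholds."""
    if rate <= GOOD_THRESHOLD:
        return "good"
    if rate <= ACCEPTABLE_THRESHOLD:
        return "acceptable"
    return "needs immediate attention"

rate = error_rate(42, 1500)                # 42 errors in 1,500 requests = 2.8%
print(f"{rate:.1f}% -> {classify(rate)}")  # "2.8% -> acceptable"
```

Here, 2.8% sits above the good target but within the acceptable ceiling: fine for now, but worth improving.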

Below is an example of a chart tracking a low and stable error rate in AppSignal:

Screenshot chart of a low and stable error rate

A good error rate is a target to reach and sustain as you develop and maintain your application. An acceptable error rate, however, acknowledges that an application's actual error rate may sometimes rise above what you've defined as "good". You can set your own parameters to determine when an error rate reaches a level that requires immediate attention.
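
One way to evaluate this in practice is to compute the error rate over a sliding window of recent requests and flag it once it crosses your acceptable ceiling. Below is a minimal sketch; the window size and threshold are assumptions to adapt, not AppSignal defaults:

```python
from collections import deque

# Illustrative assumptions: a 1,000-request sliding window and a 10%
# acceptable-rate ceiling. Tune both to your own application.
WINDOW_SIZE = 1000
ACCEPTABLE_THRESHOLD = 10.0  # percent

window = deque(maxlen=WINDOW_SIZE)  # True = the request errored

def record_request(errored: bool) -> None:
    window.append(errored)

def current_error_rate() -> float:
    if not window:
        return 0.0
    return sum(window) / len(window) * 100

def needs_attention() -> bool:
    # Only alert once the window holds enough samples to be meaningful.
    return len(window) == WINDOW_SIZE and current_error_rate() > ACCEPTABLE_THRESHOLD
```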

Track and define your error rate with AppSignal

AppSignal automatically tracks your application's errors, giving you real-time insight into your error rate. You can track this rate on an AppSignal metric dashboard, alongside key data like throughput, response times, recent error messages, and performance insights.

Screenshot of Dashboard showing error rate

You can also categorize errors by severity to differentiate critical issues from those that can be safely ignored (see the sketch after this list):

  • Critical
  • High
  • Low
  • Informational
  • Non-critical
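
For illustration, here's a small sketch of how you might map caught exceptions to these severity levels before reporting them; the exception class names and mapping are hypothetical, not an AppSignal API:

```python
# Hypothetical severity mapping; the exception class names and the
# fallback level are illustrative, not part of any AppSignal API.
SEVERITY_BY_EXCEPTION = {
    "PaymentFailedError": "critical",
    "UpstreamTimeoutError": "high",
    "CacheMissError": "low",
    "DeprecatedEndpointWarning": "informational",
    "RetryScheduledError": "non-critical",
}

def severity_for(error: Exception) -> str:
    # Default to "high" for anything not yet classified.
    return SEVERITY_BY_EXCEPTION.get(type(error).__name__, "high")
```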

AppSignal provides the data you need to make informed decisions about error rates. Get started in minutes with a free trial, and reach out for dev-to-dev support if needed!

Start your free trial

Don’t let the bad bugs bite. Try AppSignal for free.

AppSignal offers a 30-day free trial; no credit card required. All features are available in all plans. Start monitoring your application in just a few clicks!