
What are good and acceptable error rates?

In this Learning Center article, we'll explain how to define good and acceptable error rates.

A good error rate is the error rate you aim to achieve for your application, whereas an acceptable error rate is the highest error rate you're willing to tolerate before taking action.

What is regarded as a good vs. acceptable error rate will vary depending on several factors, such as the context of your application, its use case, and the expectations of your users.

Generally speaking, lower error rates are desirable as they indicate higher reliability and customer satisfaction.

What is a good error rate?

To determine what constitutes a good error rate, you need to thoroughly understand your application's architecture and the consequence of any errors for your end-users. For example, if a customer can't add an item to their shopping cart, it's not as critical as if they can't access their bank account, so a banking application will likely have a lower good error rate than an e-commerce application.

Weigh this up against your technical constraints, such as the limitations of your codebase and the practical and financial costs of maintaining a low error rate.

What is an acceptable error rate?

An acceptable error rate is the highest percentage of errors you're willing to tolerate for your application. This rate varies per application and should take into account the context and consequences of errors. You can apply these rates to your entire application or have different rates for different application components.

An acceptable error rate differs from a good error rate, which is the error rate you aim to achieve for your application.

Just as with a good error rate, defining your acceptable error rate requires a solid grasp of your application's architecture and the consequences of any potential errors. Don't forget to factor in technical constraints here as well.

In practice

An e-commerce application may have an acceptable error rate that's as high as 10%, but a good error rate of 1%, whereas a banking application may have an acceptable error rate of 1% and a good error rate of 0.1%.

A good error rate is a goal you should aim to achieve and retain when developing and maintaining your application. However, an acceptable error rate takes into consideration that an application's actual error rate may sometimes be higher than what is defined as "good". You can use your own parameters to evaluate when an error rate reaches a level that requires immediate attention.
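To make this concrete, here is a minimal sketch (in Python, purely illustrative; the threshold names and figures simply reuse the e-commerce example above, so adjust them for your own application and components) that computes an error rate from request and error counts and classifies it against good and acceptable thresholds:

```python
# Classify a measured error rate against "good" and "acceptable" thresholds.
# Values reuse the e-commerce example above: good = 1%, acceptable = 10%.

GOOD_ERROR_RATE = 0.01        # the rate you aim to achieve and retain
ACCEPTABLE_ERROR_RATE = 0.10  # above this, the rate needs immediate attention


def error_rate(error_count: int, request_count: int) -> float:
    """Return the error rate as a fraction of total requests."""
    if request_count == 0:
        return 0.0
    return error_count / request_count


def classify(rate: float) -> str:
    """Classify a measured error rate against the thresholds."""
    if rate <= GOOD_ERROR_RATE:
        return "good"
    if rate <= ACCEPTABLE_ERROR_RATE:
        return "acceptable"
    return "needs immediate attention"


# Example: 250 failed requests out of 10,000 is a 2.5% error rate --
# above the good rate, but still within the acceptable rate.
rate = error_rate(250, 10_000)
print(f"{rate:.1%} -> {classify(rate)}")  # 2.5% -> acceptable
```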

Track and define your error rate with AppSignal

AppSignal automatically tracks your application's errors, allowing you to see your error rate in real time. You can also specify if an error is critical or something that can safely be ignored (non-critical) by giving your errors one of the following severity labels:

  • Critical
  • High
  • Low
  • Informational
  • Non-critical
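As a rough, generic illustration (this is not AppSignal's API; the function, field names, and the choice of which severities "count" are assumptions made for the sketch), severity labels like those above could be used to exclude non-critical and informational errors when calculating the error rate you measure against your thresholds:

```python
# Generic sketch (not AppSignal's API): only count errors whose severity
# matters when computing the error rate compared against your thresholds.

COUNTED_SEVERITIES = {"critical", "high", "low"}  # hypothetical choice of what "counts"


def counted_error_rate(errors: list[dict], request_count: int) -> float:
    """Error rate using only errors whose severity should count."""
    counted = sum(1 for e in errors if e.get("severity") in COUNTED_SEVERITIES)
    return counted / request_count if request_count else 0.0


errors = [
    {"name": "PaymentTimeout", "severity": "critical"},
    {"name": "SlowQuery", "severity": "informational"},
    {"name": "CacheMiss", "severity": "non-critical"},
]
print(f"{counted_error_rate(errors, 1_000):.2%}")  # 0.10% -- only the critical error counts
```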

AppSignal provides the data you need to make informed decisions, like how to define a good error rate. You can start tracking your application's key metrics in as little as five minutes with a free trial. We provide dev-to-dev support if you need assistance!

Start your free trial

Don’t let the bad bugs bite. Try AppSignal for free.

AppSignal offers a 30-day free trial, no credit card required. All features are available in all plans. Start monitoring your application in just a few clicks!