Best Practice Guide for Tire Sidewall Information Scanning

Introduction

In this article, you will find additional information and best practices for data capture with Anyline. For any comments or questions, please get in touch with your contact person at Anyline.

 


General Quality Drivers

Environmental Conditions

Challenging environmental conditions and tire degradation can lead to suboptimal scanning performance. This can show up as lower scan accuracy or a longer time needed to reach a successful scan result. Below, you will find a list of the most common factors that can cause suboptimal scanning performance.

 

One simple rule applies to all quality drivers: “If the human eye can’t read the text, the technology will most likely struggle too.”

Conditions and Factors

Low Lighting Conditions

Low light reduces the contrast of the image, making it harder to separate the characters from the background. In addition, the phone camera compensates for low light by increasing the sensor sensitivity (ISO) and the exposure time, which adds noise and motion blur to the image.

Try using an external light source or turn on the flashlight on your device (if available); a sketch of enabling the torch programmatically follows this list of conditions.

DOT/TIN Example

Worn Tires / Abrasion

If a tire is scraped or damaged, characters may be faded or no longer visible. In either case, this can lead to OCR misclassification.*

In the case of worn tires, first double-check whether the characters can still be read by the human eye.

Tire Size Example

Cutout Position

Our scanner performs best when all characters are aligned within the centre of the cutout.

Any characters beyond the border of the cutout will hinder scanning performance and make the results less accurate.

Tire Make Example

Focus

An image with little to no focus will result in either an inconclusive scan or an inaccurate reading.

To avoid this, ensure that the camera is focused on the characters and that every part of the scannable object is clear and readable.

Tire Commercial ID Example

Dirt

Dirt can affect the scanner in various ways. If dirt partially or fully covers characters, they may be misread, or not detected at all (no result).

In that case, try to remove the dirt from the tire and scan again.

Tire Model Example

High or Low Angles

To achieve more accurate results and a better overall scanning experience, align your device so that the camera is directly facing the scannable item. If the angle is too high or too low, the characters may not be clear enough for the scanner to produce an accurate read.

Also, avoid angles at which a textured background surrounds the scannable area.

DOT/TIN Example

Reflection

Reflection may cause multiple problems, including over-exposure, which lowers contrast, or full occlusion of certain parts of the tire. This can lead to OCR misclassification and/or incorrect scan results.

In that case, try to improve the lighting conditions (a different or additional light source, blocking direct sunlight) and scan again.

Tire Size Example

*NOTE on Worn Tires / Abrasion:
Background texture recognition improved after v37.
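
For the low-lighting case mentioned above, the device torch can also be switched on programmatically. The snippet below is a minimal sketch assuming a plain Android CameraX setup; it is not the Anyline SDK's own flash configuration, which may expose this differently.

```kotlin
import androidx.camera.core.Camera

// Minimal sketch, assuming an Android CameraX setup (not the Anyline SDK API):
// turn the torch on for low-light scanning, but only if the device has a flash unit.
fun enableTorchIfAvailable(camera: Camera) {
    if (camera.cameraInfo.hasFlashUnit()) {
        camera.cameraControl.enableTorch(true)
    }
}
```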

Device Properties

Accuracy can also be negatively influenced by a number of factors relating to the device being used (e.g. the camera quality). Below, you will find the most important properties to consider when selecting your scanning device.

The resolution of the camera directly influences how much information is available to be processed by the recognition system. The higher the resolution of the camera, the higher the probability of an accurate result.
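
As a rough way to compare devices on this point, the cameras' maximum capture resolutions can be queried. The snippet below is a minimal sketch assuming a plain Android Camera2 setup (not part of the Anyline SDK); it simply lists each camera's largest JPEG output size.

```kotlin
import android.content.Context
import android.graphics.ImageFormat
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Minimal sketch, assuming a plain Android Camera2 setup: list each camera's
// largest JPEG output size as a rough indicator of how much image information
// the recognition system has to work with.
fun logMaxCameraResolutions(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(id)
        val streamMap = characteristics.get(
            CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP
        ) ?: continue
        // Pick the output size with the largest pixel count.
        val maxSize = streamMap.getOutputSizes(ImageFormat.JPEG)
            ?.maxByOrNull { it.width.toLong() * it.height }
        println("Camera $id: max capture size = $maxSize")
    }
}
```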

A camera with a high dynamic range (contrast) also contributes to a better and more accurate scan result.

Improper focus, caused by poor auto-focus or incorrect handling, will result in a blurry image.

To avoid this, try to hover over the scanned object for a little longer, whilst keeping the device still, and allow the camera to re-focus.
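
If the auto-focus still does not settle, an explicit focus request aimed at the centre of the frame (where the characters should be aligned) can help. The following is a minimal sketch assuming a CameraX-based capture pipeline; the Anyline SDK may handle focus internally, so treat this purely as an illustration.

```kotlin
import androidx.camera.core.Camera
import androidx.camera.core.FocusMeteringAction
import androidx.camera.core.SurfaceOrientedMeteringPointFactory

// Minimal sketch, assuming a CameraX-based capture pipeline: request an auto-focus
// pass on the centre of the frame, where the characters should be aligned.
fun refocusOnCenter(camera: Camera) {
    val factory = SurfaceOrientedMeteringPointFactory(1f, 1f)
    val centerPoint = factory.createPoint(0.5f, 0.5f)
    val action = FocusMeteringAction.Builder(centerPoint, FocusMeteringAction.FLAG_AF).build()
    camera.cameraControl.startFocusAndMetering(action)
}
```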

A large distance between the scanned object and the camera can have a similar effect on the recognition rate as a low-resolution camera. Even if the scannable object is found by the detection algorithm, the accuracy can be expected to be low due to the missing information.

Always try to get close enough to the scanned object and align the characters within the blue rectangle (cutout) for optimal scan results.

Another important factor to consider, which can lead to suboptimal scanning performance and influence the user experience, is the processing power of the device.

The more processing power available, the more computer vision and machine learning operations can be performed per second.

Glossary