
Predicting the future

Intro

EasyMile’s software for driverless solutions processes data fed by any platform’s sensor set, analyzes it, and determines how the vehicle should behave.

Collecting and crunching data in real-time 

Autonomous vehicles need a detailed, constantly updated picture of their surroundings to operate safely, and so are equipped with a full range of sensors.

These sensors collect raw data that the onboard software crunches to create a 360-degree picture of the environment, including infrastructure, other vehicles, pedestrians, and anything else in the vehicle’s path.

Real-time processing of this data allows the driverless vehicle system to decide how to behave in order to progress safely along its route (stop, go, slow down, and so on).
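As a purely illustrative sketch of this kind of decision (the function name, thresholds, and logic below are assumptions for the example, not EasyMile’s actual code), the real-time output can be pictured as mapping the free distance ahead onto one of those three behaviors:

    from enum import Enum

    class Behavior(Enum):
        GO = "go"
        SLOW_DOWN = "slow_down"
        STOP = "stop"

    def decide_behavior(clear_distance_m: float,
                        slow_threshold_m: float = 20.0,
                        stop_threshold_m: float = 5.0) -> Behavior:
        """Map the free distance ahead (from the fused sensor picture)
        to one of the three basic behaviors. Thresholds are illustrative."""
        if clear_distance_m <= stop_threshold_m:
            return Behavior.STOP
        if clear_distance_m <= slow_threshold_m:
            return Behavior.SLOW_DOWN
        return Behavior.GO

    # Example: 12 m of free space ahead -> slow down
    print(decide_behavior(12.0))  # Behavior.SLOW_DOWN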
 

EasyMile’s unique technology allows autonomous vehicles to drive safely alongside cars, bikes, pedestrians, and so on. They understand the environment as it is, so nothing extra needs to be built or installed.

Eyes everywhere 

Driverless technology uses a range of sensors to see the environment. 

Various complex processes are required to enable autonomous operation, and intelligence is fundamental to all of them.

EasyMile’s unique in-house software package includes an embedded system to automate transportation platforms. Drawing on a full range of sensors, it collects and crunches the recorded data, creating a full picture of the autonomous vehicle’s environment.

Our unique algorithms are specifically designed to adapt to the sensor set of any autonomous vehicle.

Keeping in touch continually

Continuously rendering the surrounding environment and predicting possible changes are fundamental for driverless solutions. Autonomous vehicles equipped with our software are in constant communication with the EasyMile supervision center.

These tasks rely mainly on three key functions (sketched in code after the list below):

  • Localization - Where is the vehicle?

  • Perception - What is surrounding the vehicle?

  • Path Planning - What will the vehicle do next?
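One minimal way to picture how these three functions fit together, using hypothetical Python interfaces and data types rather than EasyMile’s actual architecture:

    from dataclasses import dataclass
    from typing import List, Protocol

    @dataclass
    class Pose:
        x: float         # position east of the map origin (m)
        y: float         # position north of the map origin (m)
        heading: float   # heading (rad)
        accuracy: float  # estimated position error (m)

    @dataclass
    class Obstacle:
        x: float
        y: float
        radius: float    # simple circular footprint (m)

    class Localization(Protocol):
        def pose(self) -> Pose: ...                  # Where is the vehicle?

    class Perception(Protocol):
        def obstacles(self) -> List[Obstacle]: ...   # What is surrounding the vehicle?

    class PathPlanning(Protocol):
        def next_command(self, pose: Pose,
                         obstacles: List[Obstacle]) -> str: ...  # What will the vehicle do next?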

Localization

Using all the available data from the different sensors in a fusion algorithm, autonomous vehicles know their position, and how accurate it is, at all times. If the position estimate deviates beyond safe limits, the vehicle safely slows down or stops. In addition, the vehicle is in constant communication with both its environment and the EasyMile supervision center.

EasyMile’s powerful localization algorithm calculates the vehicle’s position with centimeter-level accuracy in real-time by fusing data obtained from the sensor set, comprising:

  • Lidar (Light Detection and Ranging),
  • Differential GPS (Global Positioning System) and RTK (Real-Time Kinematic positioning),
  • Inertial Measurement Unit (IMU),
  • Odometry.
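To give a feel for how such a fusion can work, here is a minimal inverse-variance weighted fusion of position estimates in Python. It is only a sketch under simplifying assumptions (independent, Gaussian estimates), not EasyMile’s actual fusion algorithm:

    import numpy as np

    def fuse(estimates, variances):
        """Inverse-variance weighted fusion of independent position estimates.

        estimates : list of (x, y) positions from different sources
                    (e.g. lidar map matching, GNSS/RTK, odometry dead reckoning)
        variances : list of scalar variances (m^2) describing each source's
                    current uncertainty
        Returns the fused position and its variance.
        """
        estimates = np.asarray(estimates, dtype=float)
        weights = 1.0 / np.asarray(variances, dtype=float)
        fused = (weights[:, None] * estimates).sum(axis=0) / weights.sum()
        fused_var = 1.0 / weights.sum()
        return fused, fused_var

    # Example: GNSS/RTK (2 cm std), lidar matching (5 cm std), odometry (20 cm std)
    pos, var = fuse([(10.02, 5.01), (10.05, 4.98), (9.90, 5.10)],
                    [0.02**2, 0.05**2, 0.20**2])
    print(pos, var**0.5)  # fused position with roughly 2 cm standard deviation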

EasyMile’s satellite navigation technology is a multi-GNSS system that processes GLONASS as well as the original GPS. The system’s precision is enhanced by Real-Time Kinematic (RTK) processing. The GNSS position is also used in conjunction with information from the 3G or 4G network, with 5G testing underway.
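One way to picture how the GNSS/RTK solution feeds such a fusion is to translate the receiver’s reported fix quality into an uncertainty. The labels and values below are assumptions for illustration only, not receiver or EasyMile specifications:

    # Assumed fix-quality labels and standard deviations; real values depend
    # on the receiver, constellation geometry, and correction service.
    FIX_QUALITY_STD_M = {
        "rtk_fixed": 0.02,  # carrier-phase ambiguities resolved: cm-level
        "rtk_float": 0.30,  # float solution: dm-level
        "dgps": 1.0,        # differential corrections only
        "single": 3.0,      # standalone GNSS
    }

    def gnss_variance(fix_quality: str) -> float:
        """Translate a reported fix quality into a variance (m^2) that a
        fusion step can weight against lidar and odometry. Unknown quality
        is treated pessimistically."""
        std = FIX_QUALITY_STD_M.get(fix_quality, 10.0)
        return std * std

    print(gnss_variance("rtk_fixed"))  # 0.0004 m^2 -> 2 cm standard deviation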

 

It is the combination of these intelligent and sophisticated technologies that produces the highly accurate localization EasyMile’s autonomy requires.

Perception

A full range of high-performance sensors gives the vehicle the widest possible field of vision, including close to the ground. It is therefore able to detect and avoid potential obstacles on the road. Depending on the situation, it will either adapt its speed and trajectory, safely apply the brakes, or stop.
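A simplified sketch of this kind of decision, using the standard constant-deceleration stopping distance d = v² / (2a); the thresholds, deceleration, and margin below are illustrative assumptions, not EasyMile’s parameters:

    def obstacle_response(speed_mps: float,
                          obstacle_distance_m: float,
                          max_decel_mps2: float = 2.0,
                          safety_margin_m: float = 2.0) -> str:
        """Choose a response from the distance to the nearest obstacle.

        Stopping distance under constant deceleration: d = v^2 / (2 * a).
        If the obstacle is inside that distance plus a margin, brake;
        if it is merely close, slow down; otherwise keep going.
        """
        stopping_distance = speed_mps ** 2 / (2.0 * max_decel_mps2)
        if obstacle_distance_m <= stopping_distance + safety_margin_m:
            return "brake_to_stop"
        if obstacle_distance_m <= 2.0 * (stopping_distance + safety_margin_m):
            return "slow_down"
        return "continue"

    # Example: 5 m/s (18 km/h) with an obstacle 8 m ahead:
    # stopping distance = 25 / 4 = 6.25 m, and 8 <= 6.25 + 2, so brake.
    print(obstacle_response(5.0, 8.0))  # brake_to_stop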

Autonomous vehicles using EasyMile’s software are equipped with multiple layers of redundant systems in order to maximize the safety of passengers and road users.

For example:

  • Redundant coverage from a full range of sensors, with continuous monitoring of each sensor’s health.
  • Redundant obstacle detection function (illustrated in the sketch after this list).
  • Fail-safe and redundant braking systems.
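The redundant obstacle detection principle can be sketched as a fail-safe combination of two independent channels, where any fault or disagreement falls back to the safe side. This is a hypothetical illustration of the idea, not EasyMile’s safety logic:

    from typing import Optional

    def fused_obstacle_flag(primary: Optional[bool],
                            secondary: Optional[bool]) -> bool:
        """Fail-safe combination of two independent obstacle-detection channels.

        Each channel reports True (obstacle), False (clear), or None (sensor
        fault / stale data). The path is only treated as clear when both
        healthy channels agree it is clear; anything else is treated as an
        obstacle, which triggers braking."""
        if primary is None or secondary is None:
            return True              # degraded sensing: assume an obstacle
        return primary or secondary  # either channel seeing something wins

    assert fused_obstacle_flag(False, False) is False  # both clear -> proceed
    assert fused_obstacle_flag(True, False) is True    # disagreement -> brake
    assert fused_obstacle_flag(None, False) is True    # sensor fault -> brake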

Path Planning

EasyMile’s autonomous software supports enhanced navigation features, including obstacle avoidance, V2X communication, predictive control, as well as decision-making at intersections and pedestrian crossings. The vehicle is programmed with a three-dimensional site map in which elements trigger specific behaviors, for example in a fixed-speed area.
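As an illustration of map-triggered behaviors, here is a hypothetical sketch in which rectangular zones of a site map carry a speed limit and a behavior tag; the data types, zone names, and values are invented for the example:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Zone:
        name: str
        x_min: float
        x_max: float
        y_min: float
        y_max: float
        speed_limit_mps: float
        behavior: str  # e.g. "fixed_speed", "pedestrian_crossing", "intersection"

    def active_zone(x: float, y: float, zones: List[Zone]) -> Optional[Zone]:
        """Return the first site-map zone containing the vehicle's position."""
        for z in zones:
            if z.x_min <= x <= z.x_max and z.y_min <= y <= z.y_max:
                return z
        return None

    site_map = [
        Zone("station approach", 0, 50, 0, 10, 2.5, "fixed_speed"),
        Zone("crossing 1", 50, 55, 0, 10, 1.5, "pedestrian_crossing"),
    ]
    zone = active_zone(52.0, 4.0, site_map)
    print(zone.name, zone.speed_limit_mps)  # crossing 1 1.5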

All these features allow smooth navigation through complex scenarios, taking into account the vehicle’s dynamic surroundings in real time and adapting its trajectory as required. The vehicle achieves this by being in constant communication with both its environment and the supervision center, thanks to a 4G data connection.

Our navigation stack is also vehicle-agnostic, meaning it can be integrated with most vehicles quickly and efficiently.
 
