Abstract
Rust + Edge AI = Game Changer! See how zero-cost abstractions deliver 92% faster inference, 99.997% uptime, and memory safety that C++ can’t match. Real autonomous vehicle code, live benchmarks, and the crates powering tomorrow’s intelligent devices. The future runs on Rust!
Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hi everyone.
Good morning.
Thank you for attending my session, and also a big thank you to the organizers for hosting this conference.
I'm Anush Wicker.
I am a software engineering manager working in AI, and today I would like to tell you about my interest in using Rust for AI.
Rust has been redefining edge AI, combining safety, performance, and fine-grained control to meet the real-world constraints of embedded intelligence.
Now, we do see that the edge AI market is exploding, with a 44.2% CAGR.
Now, this is not hype.
This adoption is driven by real performance demands, with a $27.8 billion market size projected by 2026.
That means there is a lot of opportunity for Rust developers.
Also, sub-50-millisecond latency is the holy grail for edge AI, and cloud infrastructure cannot reliably deliver this.
We need optimized on-device computation.
Okay, let me walk through some of the advantages of using Rust for edge AI.
First is memory safety without garbage collection: we can avoid memory leaks and data races without a garbage collector slowing us down.
We also have zero-cost abstractions, the ability to write clean, expressive code without paying for it in performance.
And Rust gives us fine-grained system control, which matters because in edge AI every byte and every CPU cycle counts.
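To make the zero-cost abstraction point concrete, here is a minimal sketch (the function and data are illustrative, not from the talk): a high-level iterator chain that the compiler lowers to the same tight loop you would write by hand, with no allocations and no garbage collector involved.

```rust
// Illustrative sketch: a weighted sum written with iterator adapters.
// The abstraction is "zero cost" in the sense that the optimizer lowers
// this to a plain loop over the slices, with no heap allocation.
fn weighted_sum(inputs: &[f32], weights: &[f32]) -> f32 {
    inputs
        .iter()
        .zip(weights)
        .map(|(x, w)| x * w)
        .sum()
}

fn main() {
    let x = [0.5f32, 1.0, 2.0];
    let w = [0.2f32, 0.3, 0.5];
    println!("{}", weighted_sum(&x, &w)); // prints ≈ 1.4
}
```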
Let's talk about some of the real-world performance advantages we see with Rust.
There is a 92% performance improvement over other languages, and these are production numbers.
There is also an 82% cost reduction, which is a big business driver, and a 73.2% reduction in memory usage.
That can be the difference between fitting your model on the device or not fitting it at all.
Let's talk about some of Rust's secret weapons for edge AI.
First is safe concurrency: on multi-core ARM we can scale safely across cores without runtime penalties.
Next is efficient single instruction, multiple data with portable SIMD, which gives 2 to 6x speedups without handwritten assembly.
There is also seamless hardware acceleration integration, from NPUs to DSPs; Rust plays nicely with them, even in no_std environments.
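As a rough sketch of what safe concurrency across cores looks like (the operator and chunking are illustrative, not from the talk), scoped threads let each worker mutate a disjoint slice of a buffer, and the borrow checker rejects any overlapping access at compile time:

```rust
use std::thread;

// Illustrative sketch: apply ReLU in place across several cores.
// Each scoped thread borrows a disjoint chunk, so data races are
// impossible by construction and there is no runtime overhead beyond
// the threads themselves.
fn relu_in_place(data: &mut [f32], workers: usize) {
    let chunk = (data.len() + workers - 1) / workers.max(1);
    thread::scope(|s| {
        for part in data.chunks_mut(chunk.max(1)) {
            s.spawn(move || {
                for x in part.iter_mut() {
                    *x = x.max(0.0);
                }
            });
        }
    });
}

fn main() {
    let mut v = vec![-1.0f32, 2.0, -3.0, 4.0];
    relu_in_place(&mut v, 2);
    println!("{v:?}"); // [0.0, 2.0, 0.0, 4.0]
}
```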
Let's talk about a case study on autonomous vehicles.
The challenge here was a 15-millisecond obstacle detection budget and a 99.2% detection success rate.
The result was a 12.7-millisecond inference time and 99.99% uptime.
That is why Rust is reliable for safety-critical applications.
Rust's compile-time guarantees also eliminate a lot of the runtime errors that plague C++ applications.
Let's talk about a different case study now: industrial IoT security.
There, a 99.85% threat detection accuracy was observed on modest hardware with Rust on the edge.
We also see a 22-millisecond response time and 4.8 years of battery life on that same hardware.
So we see predictable performance, which makes low-power edge devices viable for years.
The Rust edge AI ecosystem includes Candle, a high-performance neural network library used for inference with minimal dependencies.
Next there is Burn, a deep learning framework with a strong type system and modular backends, which enables model training.
And finally we have SmartCore, a machine learning library implementing traditional machine learning algorithms that is also compatible with no_std.
This ecosystem is rapidly maturing, with new crates and capabilities emerging monthly to support the entire edge AI development lifecycle.
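To give a flavor of the inference side, here is a minimal sketch using candle-core (assuming its current Tensor API; the shapes and values are purely illustrative):

```rust
use candle_core::{Device, Tensor};

fn main() -> candle_core::Result<()> {
    // Illustrative sketch: a tiny "layer" as a matrix multiply plus ReLU,
    // running on the CPU with minimal dependencies.
    let device = Device::Cpu;
    let input = Tensor::new(&[[1.0f32, 2.0]], &device)?; // 1 x 2
    let weights = Tensor::new(&[[0.5f32, -1.0], [0.25, 1.5]], &device)?; // 2 x 2
    let output = input.matmul(&weights)?.relu()?;
    println!("{output}");
    Ok(())
}
```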
Let's talk about implementation patterns for custom neural network operators.
Whenever we are writing a neural network operator in Rust for edge AI, we need precise optimization, and these are some of the key patterns for achieving maximum performance and efficiency.
With no heap allocations, we get no_std compatibility, which is crucial for highly constrained embedded systems and also guarantees deterministic, low-latency operation.
Optimized cache utilization, meaning cache-friendly memory access patterns, dramatically reduces memory latency and leads to faster processing and execution.
Manual loop unrolling improves CPU instruction pipeline efficiency and gives highly predictable execution times for real-time edge AI tasks.
And direct slice manipulation, optimized so that bounds checks are removed in release mode, ensures maximum performance while maintaining Rust's core safety guarantees.
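As a hedged sketch of those patterns combined (the operator is a plain dot product, chosen for brevity rather than taken from the talk): no heap allocation, chunked iteration that lets release builds drop bounds checks, a manual four-way unroll, and only core slice APIs so it stays no_std-friendly.

```rust
/// Illustrative sketch: a dot product written with the patterns above.
/// Uses only `core` slice APIs (no_std-friendly), allocates nothing,
/// and unrolls the inner loop four ways so the compiler can keep the
/// instruction pipeline busy and elide bounds checks in release mode.
pub fn dot_unrolled(a: &[f32], b: &[f32]) -> f32 {
    debug_assert_eq!(a.len(), b.len());
    let mut acc = [0.0f32; 4];
    // `chunks_exact` hands the optimizer fixed-length sub-slices, so the
    // indexing below needs no runtime bounds checks in release builds.
    for (ca, cb) in a.chunks_exact(4).zip(b.chunks_exact(4)) {
        acc[0] += ca[0] * cb[0];
        acc[1] += ca[1] * cb[1];
        acc[2] += ca[2] * cb[2];
        acc[3] += ca[3] * cb[3];
    }
    // Handle the tail elements that did not fill a full chunk of four.
    let tail = a.len() - a.len() % 4;
    let mut rest = 0.0f32;
    for i in tail..a.len() {
        rest += a[i] * b[i];
    }
    acc[0] + acc[1] + acc[2] + acc[3] + rest
}
```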
Now let's talk in more detail about the implementation patterns for optimizing for ARM and RISC-V processors.
Rust lets us compile binaries that fully exploit the hardware.
For example, we can use Cargo features and cfg attributes to enable instruction sets that are specific to a CPU.
The Rust compiler then generates highly optimized, architecture-specific binaries that leverage the capabilities of ARM and RISC-V processors, which eventually leads to peak performance.
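A rough sketch of that pattern is below; the build flags are examples, and a Cargo feature flag could gate the same functions, while the target_arch and target_feature cfg keys themselves are standard.

```rust
// Illustrative sketch of architecture-specific dispatch with cfg attributes.
// Build with, e.g., RUSTFLAGS="-C target-feature=+neon" for aarch64 targets;
// other targets (including RISC-V) fall back to the portable path.

#[cfg(all(target_arch = "aarch64", target_feature = "neon"))]
fn accumulate(xs: &[f32]) -> f32 {
    // With NEON available, the autovectorizer can emit SIMD instructions
    // for this loop on ARM cores.
    xs.iter().sum()
}

#[cfg(not(all(target_arch = "aarch64", target_feature = "neon")))]
fn accumulate(xs: &[f32]) -> f32 {
    // Portable fallback, compiled for RISC-V and any other target.
    xs.iter().sum()
}

fn main() {
    let v = [1.0f32, 2.0, 3.0, 4.0];
    println!("sum = {}", accumulate(&v));
}
```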
Okay.
Using Rust is actually better because some of the drawbacks of the C programming language can be nullified.
One thing is direct, safe control of the hardware, which is possible even with memory-mapped hardware accelerators.
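A minimal, hypothetical sketch of that idea (the register address and bit layout are invented for illustration; a real driver would take them from the board's datasheet):

```rust
use core::ptr;

// Hypothetical control register of an on-device accelerator (NPU).
// The address and the meaning of bit 0 are illustrative only.
const NPU_CTRL: *mut u32 = 0x4000_0000 as *mut u32;

/// Safe wrapper around the memory-mapped register: the single `unsafe`
/// volatile write is contained here, so the rest of the code base keeps
/// Rust's usual guarantees.
pub fn start_inference() {
    // SAFETY: assumes NPU_CTRL maps to a valid, device-owned register on
    // the target board and that writing 1 to bit 0 starts the accelerator.
    unsafe { ptr::write_volatile(NPU_CTRL, 1) };
}
```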
The real-world impact of using Rust, before and after, is shown on this slide.
Rust reduces memory-related crashes.
It also brings the average inference time down to 12.3 milliseconds, helps with heat dissipation issues through cooler operating temperatures, and has been instrumental in extending battery life by 4.7 years.
The missed detection rate has also been reduced to 0.08 percent.
Rust actually changes this equation: more capability with less cost and more reliability.
Some of the key takeaways from today's session: Rust and edge AI together will enhance the performance of the next generation of intelligent systems.
Rust is memory safe without compromises, so we can eliminate an entire class of bugs without sacrificing the performance of systems programming for mission-critical edge applications.
The Rust edge AI ecosystem is also rapidly maturing, with high-quality libraries and frameworks, so there are a lot of opportunities and alternatives while offering superior safety guarantees.
The production deployments we looked at demonstrate advantages on the order of 90% performance improvement, 70% memory reduction, and 99.9% reliability.
As edge computing continues its explosive growth, Rust is positioned to become the foundation of intelligent systems that combine unprecedented performance with reliability.
You can always get started with Rust on your own system by using cargo to add candle-core, burn, and smartcore.
Please let me know if you have any questions.
You can reach out to me via chat and also on my LinkedIn profile.
And once again, thank you to the organizers of this conference, and also to you, the audience.
Okay.