Edge Intelligence

Edge computing devices & analytics for a smarter Internet of Things.

Contact Us
Low Power AIoT

Embedded AI empowering a new wave of advanced sensing

At EdgeMachines we develop cutting-edge, end-to-end IoT & edge computing systems for the modern world using open tools, protocols and standards. Our hardware and software platform lets us deploy low-power devices in the real world without the need for expensive, high-bandwidth infrastructure. Using our AI models and FPGA & ASIC accelerators we create advanced IoT systems that gather complex data from the real world. From harsh industrial environments to remote national parks, find out how our AIoT Platform can monitor your world.

Smart Cities & Smart Campuses

Smarter Spaces. Safer Places.

EdgeMachines StreetNet.

StreetNet is an end-to-end implementation of the EdgeMachines AIoT platform into a standardised luminaire extension module. Helping Local Governments, Schools, Universities and Corporates create smarter places.

Smart Street Lights that can see and hear the world around them.

Thanks to StreetNet’s low-power design, it is capable of providing smart object detection & tracking while still meeting the limited power budget of the Zhaga Book 18 standard. StreetNet sensors don’t need additional batteries or external power sources (e.g. solar), greatly reducing ongoing maintenance costs while still allowing the sensor to operate 24/7.

Zhaga Socket Compatible
EdgeMachines StreetNet is plug and play: any Zhaga Book 18 certified luminaire is compatible with our sensors, greatly reducing installation costs. The sensor automatically learns parking spots and traffic trajectories. Simply record the location where the module was installed & configure any parking time restrictions in our provisioning app, wait for it to join your chosen LPWAN and you’re done.
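For illustration, a provisioning record might look like the sketch below; the field names, example DevEUI and parking-rule structure are assumptions, not the actual StreetNet provisioning schema.

```python
# Hypothetical StreetNet provisioning payload (Python dict for readability).
# All field names and values are illustrative assumptions.
provisioning = {
    "device_eui": "70B3D57ED0000000",                        # example LoRaWAN DevEUI
    "install_location": {"lat": -33.8390, "lon": 151.2070},  # pole coordinates
    "mounting_height_m": 6.0,
    "parking_rules": [
        {"zone": "2P", "max_minutes": 120,
         "days": "Mon-Fri", "hours": "08:00-18:00"},
    ],
    "network": {"type": "LoRaWAN", "join_mode": "OTAA"},
}
```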

Track more than traffic and pedestrians

Capture a wealth of information previously unavailable from traditional smart city sensors. Traditional traffic sensors using pressure or motion sensors, radar or lidar are generally good at detecting moving objects; however, they have a poor understanding of the static environment around them. Using embedded computer vision, StreetNet enables a broader range of sensing outputs suited to varied street lighting applications including roadways, parks, parking lots, and school, university & corporate campuses. Simplify your IoT stack by investing in one platform that can be deployed across your varied fleet of lighting assets.

Count vehicles, bikes and people
Track object trajectories
Create smart parking lots
Track parked car time limit compliance

From traffic & cycleway monitoring to smart parking lots & curbside parking, our sensor can provide a wealth of data streamed directly to the StreetNet dashboard & app or accessed via an open API.
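As a sketch of what pulling count data over an open API could look like, the snippet below uses Python's requests library; the base URL, endpoint path, query parameters and response shape are all assumptions for illustration, not the published StreetNet API.

```python
# Hypothetical query of hourly object counts from a StreetNet-style REST API.
import requests

API_BASE = "https://api.example.com/streetnet/v1"    # placeholder base URL
HEADERS = {"Authorization": "Bearer <api-token>"}    # placeholder credential

resp = requests.get(
    f"{API_BASE}/devices/pole-042/counts",           # hypothetical endpoint
    params={
        "from": "2024-05-01T00:00:00Z",
        "to": "2024-05-02T00:00:00Z",
        "classes": "vehicle,bicycle,pedestrian",
    },
    headers=HEADERS,
    timeout=10,
)
resp.raise_for_status()
for bucket in resp.json().get("buckets", []):        # assumed response layout
    print(bucket["timestamp"], bucket["vehicle"],
          bucket["bicycle"], bucket["pedestrian"])
```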

Helping create cleaner and more efficient spaces

Environmental Sensors
Built-in sensors track temperature, humidity, barometric pressure, vibrations (from wind, etc.) and particulate matter & VOCs in the air. Monitor the urban heat island effect and detect pollution.
Smarter waste management
Detect trash on walkways, curbs and roads. Detect assets like public trash cans. Use the data gathered to guide smart street sweeping, snow plowing and better trash collection scheduling, and to minimise waste entering stormwater drains & waterways.
Energy Efficiency
Help build greener cities: use activity data to dim street lights and save power, as sketched below. Control via DALI bus integration is coming soon.
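As a purely illustrative sketch of activity-based dimming (DALI control is still listed as coming soon), the function below maps recent detection counts to a DALI arc-power level; the thresholds and levels are assumptions, not shipped behaviour.

```python
# Illustrative activity-based dimming rule. Thresholds and the mapping onto
# the 0-254 DALI arc-power scale are assumptions for this sketch.
def dim_level(detections_last_10_min: int) -> int:
    """Return a DALI arc-power level (0-254) based on recent street activity."""
    if detections_last_10_min == 0:
        return 64        # dim an empty street to save power
    if detections_last_10_min < 5:
        return 160       # moderate light for light activity
    return 254           # full brightness when the street is busy

print(dim_level(0), dim_level(3), dim_level(12))     # 64 160 254
```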

Detect Impulse Sounds

Sound sensing for a holistic world view

Beyond simply monitoring noise levels in decibels, sound sensing uses our embedded AI hardware to continually process & classify sounds on the device. Because the device does all the processing, no audio data is ever recorded or transmitted off the device.
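The pattern can be sketched as follows: audio is classified in place and only a small label ever leaves the device. The peak-threshold stand-in below is not the embedded AI model; it simply illustrates the flow, and every name in it is an assumption.

```python
# Minimal sketch of the "only labels leave the device" pattern.
import numpy as np

SAMPLE_RATE = 16_000                     # assumed microphone sample rate

def classify_window(samples: np.ndarray) -> tuple[str, float]:
    """Stand-in for the on-device classifier (normally the embedded AI model)."""
    peak = float(np.max(np.abs(samples)))
    return ("impulse_sound", 0.9) if peak > 0.5 else ("background", 0.6)

def process_window(samples: np.ndarray):
    label, confidence = classify_window(samples)
    # The raw samples go out of scope here; they are never stored or transmitted.
    if label != "background" and confidence > 0.8:
        return {"event": label, "confidence": confidence}   # only this is sent
    return None

# One second of synthetic audio containing a short, loud click.
window = np.zeros(SAMPLE_RATE, dtype=np.float32)
window[8_000:8_200] = 0.9
print(process_window(window))            # {'event': 'impulse_sound', 'confidence': 0.9}
```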

Impulse Sounds
Embedded AI models are always listening, allowing detection of short but important sound events such as a car crash or gunshot. Create safer places and enable local authorities to be proactive with real-time notifications of compliance breaches.
Conservation & Protection
Our sound AI models are capable of classifying human-made noise into over 16 categories. From construction & traffic noise disturbing local residents to chainsaws being used in National Parks, sound provides a wealth of new information.

Classify environmental sounds to detect: traffic, hooning cars, chainsaws, gunshots, crashes, barking dogs, yelling, sirens, construction, fighting and wildlife.

8:02 AM – Traffic Noise
9:18 AM – Construction Noise
1:22 PM – Sudden Crash (possible car accident)
3:51 PM – Dog Barking
5:10 PM – Traffic Noise
6:35 PM – Construction Noise (compliance breach)
10:55 PM – Loud Music
11:20 PM – Hooning Vehicles (notify local authorities)
1:14 AM – Screaming (notify local authorities)
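Downstream, detections like those above can be routed to actions. The sketch below mirrors the sample timeline; the class names, quiet-hours window and action labels are assumptions, not a published StreetNet schema.

```python
# Illustrative routing from detected sound classes to follow-up actions.
from datetime import time

NOTIFY = {"gunshot", "car_crash", "screaming", "hooning_vehicles"}   # assumed classes
QUIET_HOURS = (time(18, 0), time(7, 0))   # assumed local construction curfew

def route(label: str, at: time) -> str:
    if label in NOTIFY:
        return "notify_local_authorities"
    in_quiet_hours = at >= QUIET_HOURS[0] or at < QUIET_HOURS[1]
    if label == "construction_noise" and in_quiet_hours:
        return "flag_compliance_breach"
    return "log_only"

print(route("construction_noise", time(18, 35)))   # flag_compliance_breach
print(route("hooning_vehicles", time(23, 20)))     # notify_local_authorities
```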

Advanced cameras combined with broad-spectrum microphones, onboard AI acceleration, and calibrated environmental sensors make the EdgeMachines IoT platform uniquely capable.

Optional depth vision
Stereo Vision for applications where distance matters
Hear everything
With in-built microphones, detect events cameras can’t see
Embedded AI Acceleration
Optimised AI models that run on FPGA or ASIC AI Accelerators
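Vendor toolchains for FPGA and ASIC accelerators differ, but most start from a quantised model. As a generic illustration only (not the EdgeMachines toolchain), post-training quantisation with TensorFlow Lite looks roughly like this; the model path is hypothetical.

```python
# Generic post-training quantisation step before deploying to an edge accelerator.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("detector_savedmodel")  # hypothetical path
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # enable post-training quantisation
tflite_model = converter.convert()

with open("detector_quant.tflite", "wb") as f:
    f.write(tflite_model)
```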

Exceptionally Power Efficient

High TOPS per Watt
Low Power Optimised Algorithms
Optional Solar Power & Battery for 24/7 Operation
Typical power consumption (lower is better):

NVIDIA Jetson Nano - 10 Watts
Raspberry Pi 4B - 6 Watts
Google Coral Dev Board - 5 Watts
EdgeMachines - 2-3.5 Watts*
Kendryte K210 + ESP32 - ~1 Watt

* up to 3.5 W when using spatial AI

Spatial AI. Model Distance.

When every moment counts, AI-powered depth perception gives important context

Real-time identification of hazards in workplace and home settings can be performed on-device. Stereo vision allows our sensors to understand depth within a scene.

Stereo Vision
Embedded spatial AI allows for advanced sensing applications without the need for expensive hardware (e.g. LIDAR). Accurately estimate object size & distance between objects.
Human-Machine Safety
Detect dangerous interactions between humans and other objects, monitor social distancing or identify heavy vehicles.
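The principle behind stereo depth is simple: for a rectified stereo pair, depth is focal length times baseline divided by disparity. The sketch below shows that relation; the focal length, baseline and disparity values are illustrative, not device specifications.

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("object not matched between the two views")
    return focal_px * baseline_m / disparity_px

# e.g. 800 px focal length, 7.5 cm baseline, 24 px disparity -> 2.5 m away
print(depth_from_disparity(800, 0.075, 24))
```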
Open Platform. Easy to Integrate.

Embedded AI hardware & sensors built around an open platform.

We focus on developing products & projects that use open APIs, putting data ownership in the hands of our clients, reducing installation & processing costs, increasing ease of integration and reducing lock-in risks.

Communication protocols & data integrations work with EdgeMachines, allowing you to build on your existing IoT investment; a minimal integration sketch follows the list below.

More integrations can be added as needed for your use case
LoRaWAN
WiFi
The Things Network (TTN)
MQTT
NB-IoT
LTE-M
OpenBalena
GCP IoT Core
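As a sketch of how an MQTT integration might consume events, the snippet below uses the paho-mqtt client; the broker address and topic layout are assumptions, not the actual StreetNet topics.

```python
# Hypothetical MQTT subscriber for StreetNet-style event messages (paho-mqtt 1.x API).
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe("streetnet/+/events")        # assumed topic layout

def on_message(client, userdata, msg):
    event = json.loads(msg.payload)
    print(msg.topic, event)                       # forward into your own systems here

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("mqtt.example.com", 1883)          # placeholder broker
client.loop_forever()
```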
Speak with a Solution Engineer

Privacy built in. All your data are belong to you…

We are entering the age of edge computing: with devices becoming more intelligent and users becoming more privacy conscious, edge computing lets us deploy sensor solutions that need no cloud processing. Performing computation & inference on the device, at the edge, means devices require less bandwidth and supporting infrastructure. Our sensors are designed to operate over low-bandwidth, low-power communication networks (LPWANs) and are faster to deploy.

Privacy first design
By performing data analysis & machine learning on the device itself, video, motion and sound data never needs to leave the device; it’s never broadcast over the internet or processed in the cloud. Our sensors tell you what’s happening without having to show you.
Open protocols
Our communication protocols and APIs are open to use, meaning you can integrate directly with the sensor, and because the device does the analytics you don’t even need to use our cloud service.
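For example, a compact LoRaWAN uplink can be decoded directly in your own stack; the byte layout below is an assumption for illustration, not the actual StreetNet frame format.

```python
# Hypothetical decoder for a compact 8-byte uplink: three uint16 counts plus
# a signed temperature in tenths of a degree, little-endian.
import struct

def decode_uplink(payload: bytes) -> dict:
    vehicles, pedestrians, cyclists, temp10 = struct.unpack("<HHHh", payload)
    return {
        "vehicles": vehicles,
        "pedestrians": pedestrians,
        "cyclists": cyclists,
        "temperature_c": temp10 / 10,
    }

print(decode_uplink(bytes.fromhex("0c0004000200e600")))
# {'vehicles': 12, 'pedestrians': 4, 'cyclists': 2, 'temperature_c': 23.0}
```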
Idea. Prototype. Production.

Use our off-the-shelf or bespoke component design services to achieve your sensing goals.

By leveraging our base embedded device platform along with our expertise in data science, computer vision, machine learning and edge computing we can provide unique, practical solutions to previously infeasible or uneconomic sensing and data challenges.

End-to-end Integration
Systems don’t live in isolation, nor should your data. We specialise in the integration of multiple assets and systems to power data-driven organisations.
Secure Apps
Gathering data is only one step on the pathway to embracing IoT and edge computing. Our expert web and mobile app developers build the apps that put monitoring and control in the palm of your users’ hands.

Contact Us to take your ideas to production

North Sydney, Australia

Use the EdgeMachines AIoT Platform to build smarter cities, roads, workplaces, campuses, homes, schools, waterways and shopping malls.