This repository contains the ESP-IDF firmware I wrote for my two-wheeled self-balancing robot.
I did not tune this project by jumping straight into full hardware tests. I verified the controller in three steps:
1. **SILS**: to check whether the controller logic behaved as expected
2. **HILS**: to check whether the actual ESP32 firmware and the Simulink model exchanged data correctly
3. **Real robot testing**: to confirm disturbance recovery, speed response, and Bluetooth driving on hardware
This robot is not just a pair of wheels. I used GM4108H-120T BLDC wheel motors together with RX-28 joint actuators, so I wanted this README to explain not only what the code does, but also how I validated the controller before relying on the real machine.
- An ESP32-S3 balancing control loop
- A cascaded controller with the structure `velocity -> target pitch -> pitch control -> left/right motor voltage split`
- Encoder-based velocity estimation and 3-phase voltage generation for the BLDC motors
- Bluetooth joystick control for forward motion, stop, and turning
- A MATLAB/Simulink HILS setup connected over UART
- A validation flow that went from SILS to HILS and then to the physical robot
The main idea in this project was to separate "keeping the robot upright" from "making it move where I want."
- `imu_timer_init()` runs the IMU-side update at 200 Hz
- `encoder_timer_init()` runs the encoder-side update at 1 kHz
- The IMU path updates `pitch`, `yaw`, and `roll`
- The encoder path calculates wheel angle, wheel speed, and the motor voltages that actually go to the BLDCs
The control flow is simple in concept:
- I do not send the target velocity directly to the motors.
- I first convert velocity error into a `target pitch`.
- I then compare the current pitch with the target pitch and generate the balancing output.
- I mix the yaw term into the left and right motor commands as a differential component.
- I generate `Vq_left` and `Vq_right`, then convert them into 3-phase voltages for the BLDC motors.
In other words, the velocity controller decides how much the body should lean, and the pitch controller keeps the robot from falling while following that lean target.
The semaphore-based scheduling in this repository was part of the real hardware code path, not just a simulation-side idea.
- `encoder_timer_init()` configures a GPTimer with 1 MHz resolution and `alarm_count = 1000`, so the encoder side is triggered every 1 ms (1 kHz).
- Each encoder tick queues SPI reads for both wheel encoders, and `spi_post_callback()` releases `encoder_sem` with `xSemaphoreGiveFromISR()` as soon as the read is complete.
- `motor_control_task` is created at priority 5 and blocks on `xSemaphoreTake(encoder_sem, portMAX_DELAY)`, so the motor-voltage update runs only when fresh encoder data is ready.
- If that encoder semaphore wakes a higher-priority task, `portYIELD_FROM_ISR()` requests an immediate context switch right after the ISR finishes.
- `imu_timer_init()` uses another GPTimer with `IMU_ALARM_COUNT = 5000`, which means the IMU path runs every 5 ms (200 Hz), and its ISR releases `imu_sem` for the state-update path in `app_main()`.
- Other real-time work is also separated: `RX28_Task` runs at priority 4, and `hc06_event_task` handles Bluetooth packets through the ESP-IDF UART event queue at priority 12.
This detail mattered a lot on the real robot. Before I used semaphore-based wakeups, the motors vibrated badly because the balancing loop timing was not deterministic. After I made the encoder-triggered motor path higher priority than the IMU path, and woke it directly from the encoder ISR, the vibration disappeared.
In `hc06.c`, I parse Bluetooth joystick packets in the `S,x,y,diff,E` format.
- Small inputs are ignored with a deadband
- Small joystick magnitude means stop
- Larger forward input becomes a forward velocity target
- Turning is handled as a left/right voltage difference
Because of that structure, forward motion and turning are not handled as separate disconnected modes. The steering command is layered on top of the balancing controller.
This robot also includes joint actuators, not just wheel control.
- `rx28.c` controls four `RX-28` actuators
- `MAX485` is used for TTL-to-RS485 conversion
- `lidar.c` includes expansion code for `YDLIDAR G2`
The balancing controller is the center of this repository, but the code is organized so I can extend it with more sensing and joint-side behavior later.
I did not tune this controller by repeatedly throwing the real robot onto the floor and hoping for the best. I checked it step by step.
I first used SILS to see whether the controller response made sense in theory.
At this stage, I was mainly checking whether the response diverged, whether the controller could settle, and whether the balancing behavior looked realistic before I touched the real machine.
After SILS, I built an HILS setup to verify that the real ESP32 firmware could exchange data correctly with the MATLAB/Simulink model.
The point of HILS in this project was not just "simulation." I wanted to confirm that the exact firmware running on the ESP32 followed the same control flow I intended to use on hardware.
To do that, I connected the PC and ESP32 through a CP2102 USB-to-TTL module and used UART2 on the ESP32.
- ESP32 TX: `GPIO17`
- ESP32 RX: `GPIO18`
- PC-ESP32 serial bridge: `CP2102`
MATLAB sends four simulated sensor values to the ESP32. In other words, the ESP32 is not receiving arbitrary test numbers here. It is receiving the same kinds of states it would normally read from the real wheel encoders and the IMU.
- from the virtual wheel encoders:
- right encoder value
- left encoder value
- from the virtual IMU:
- pitch
- yaw
Each value is a single `float`, which means 4 bytes per value. That makes one packet 16 bytes in total.
According to the HILS flow document:
- encoder-side data is refreshed at 1 kHz
- pitch and yaw are refreshed at 200 Hz
In practice, MATLAB converts those values into bytes with Byte Pack, then sends them over UART so the ESP32 can feed them into the controller.
The data rate also explains why I used a high baud rate:
- `16 bytes * 1000 = 16000 bytes/s`
- UART adds framing overhead because each byte also carries start and stop bits
- the document estimates the required line rate at about 160000 bps
Because of that, I used 921600 bps so the HILS link had enough margin.
On the return path, the ESP32 takes the received encoder values and attitude values, runs the controller, and sends back the final 3-phase voltages for the left and right BLDC motors.
That means:
- 3 phases for the left motor
- 3 phases for the right motor
- 6 `float` values in total
So the return packet size is:
6 floats * 4 bytes = 24 bytes
In short, the HILS loop works like this:
- MATLAB acts like the sensor side and sends a 16-byte state packet to the ESP32
- the ESP32 calculates the motor commands and sends back a 24-byte 3-phase voltage packet
On the Simulink side, Serial Receive reads the 24-byte payload first, and Byte Unpack reconstructs the 6 float values so the virtual BLDC model can use the same motor commands that would be applied on the real robot.
This stage let me verify the real firmware logic without repeatedly crashing the hardware during early tuning.
After HILS, I moved to the physical robot and checked disturbance rejection, velocity response, and Bluetooth driving.
In the real tests, I focused on three things:
- whether the robot could recover after an external push
- whether it leaned naturally to follow a speed target
- whether Bluetooth teleoperation could be added without breaking balance
One of the main hardware-side lessons was scheduling stability. The semaphore and priority structure was not just a clean software design choice. It was what removed the severe motor vibration I saw before the encoder-driven motor path was allowed to preempt the slower IMU path.
GitHub does not always render repository mp4 files nicely inside README.md, so I organized the videos as links.
- HILS test video
- SILS yaw test video
- SILS target velocity test video
- Real-world disturbance rejection test
- Real-world hold-position test
- Real-world Bluetooth driving test
- HILS structure reference PDF
Additional material:
These are the main parts I used in the final build:
- Main controller: ESP32-S3-WROOM-1 N16R8
- Wheel motors: GM4108H-120T BLDC motors x2
- Wheel motor drivers: MKS SimpleFOC MINI x2
- Wheel encoders: AS5048A x2
- Joint actuators: Dynamixel RX-28 x4
- Communication converter: MAX485 TTL to RS-485
- IMU: WT901
- Bluetooth module: HC-06
- HILS UART bridge: CP2102 USB-to-TTL
- Power: 3S LiPo battery, XL6009 boost converters
I also reviewed or used these as support tools or expansion items:
- U2D2
- U2D2 Power Hub
- YDLIDAR G2
- MPU6050
For the final balancing tests, the IMU I actually used was WT901.
- `app_main.c`: entry point, task creation, semaphore waits, and controller integration
- `encoder.c`, `encoder.h`: encoder SPI timing, ISR wakeup, velocity estimation, and 3-phase voltage generation
- `imu.c`, `imu.h`: WT901 data acquisition and the 200 Hz IMU semaphore path
- `pid.c`, `pid.h`: PID calculation
- `pwm.c`, `pwm.h`: PWM output
- `hc06.c`: Bluetooth joystick input
- `rx28.c`, `rx28.h`: RX-28 control
- `lidar.c`, `lidar.h`: LiDAR expansion code
- `variable.h`: shared control variables and constants
This project is written for ESP-IDF.
- Move into the project directory.
- Open the ESP-IDF environment.
- Build the project: `idf.py build`
- Flash the board: `idf.py -p (PORT) flash`
- Open the serial monitor if needed: `idf.py -p (PORT) monitor`

This project is my attempt to build a self-balancing robot that does not just work once in simulation, but is verified step by step through SILS, HILS, and real hardware tests using the same ESP32 control code.



