RoboCup SPL 2015 – BEE Final Year Project

RoboCup, the World Cup of robotics, is an international competition held annually with the aim of promoting research in robotics. Its Standard Platform League (SPL) is a soccer competition between teams of fully autonomous humanoid robots. The hardware platform for RoboCup SPL is Aldebaran Robotics’ (now SoftBank Robotics) NAO robot. Identical hardware compels the competing teams to focus on algorithm and software development.

Team-NUST, from the Robotics and Intelligent Systems Engineering (R.I.S.E) Research Center at the National University of Sciences and Technology, Pakistan, was the only team from South Asia to qualify for RoboCup SPL 2015, judged on the quality of work completed at the time of qualification. Unfortunately, sponsorship and funding constraints kept us from participating in the actual competition. Even so, developing software to control a humanoid robot as our Bachelor of Engineering in Electrical Engineering Final Year Project (Senior Design Project) was an enriching and enlightening experience.

NAO Robot – the hardware platform

Softbank Robotics’ NAO Robot

RoboCup SPL specifies SoftBank Robotics’ NAO humanoid robot as the common hardware. Some of NAO’s salient features are:

  • 25 degrees of freedom
  • On-board Intel Atom Z530 CPU and 1 GB RAM
  • NAO OS (OpenNAO), a Gentoo-based Linux operating system
  • Inertial sensors
  • Dual cameras
  • Infrared and ultrasonic sensors
  • Multiple microphones for audio localization
  • NAOqi API for application development

The capabilities and development support offered by NAO make it an ideal platform for basic to intermediate humanoid robotics research.

System Overview

What does a robotic system designed to play soccer look like? To play soccer, the robot uses its inputs (cameras, motor joint readings, and feet pressure sensors in this case) to construe information about its environment. This environment information is processed, keeping in view the robot’s individual and team goals, both short term and long term, to decide the most favorable action. Finally, the kinematics engine translates the intended action into a series of motor movements while ensuring dynamic stability.
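The loop described above can be sketched as a minimal sense-decide-act pipeline. Everything here (the function names, the hard-coded ball position) is illustrative only, not the project’s actual interfaces:

```python
# Minimal sketch of the sense-decide-act loop described above.
# All names and values are illustrative, not the actual Team-NUST code.

def perceive(camera_frame, joint_readings):
    """Extract landmark observations from raw sensor input."""
    return {"ball": (1.2, 0.3)}  # e.g. ball at (x, y) in the robot frame

def decide(world_state):
    """Pick the most favorable action given the current world model."""
    return ("walk_to", world_state["ball"])

def act(action):
    """Hand the chosen action to the kinematics engine."""
    return f"executing {action[0]} -> {action[1]}"

frame, joints = object(), object()   # stand-ins for real sensor data
state = perceive(frame, joints)
print(act(decide(state)))            # prints "executing walk_to -> (1.2, 0.3)"
```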

Vision – what we see

The primary sources of input during a game are the two front-facing cameras on NAO’s face. The fields of view of the two cameras overlap only in a very small region, so stereoscopic vision is not a viable option. The pertinent landmarks on the soccer field include the goals, the center circle, and the corners. These landmarks are detected in the captured image, with a degree of confidence attached to each detection for use in later stages.
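As an illustration of attaching a confidence value to a detection, a toy colour-ratio check might look like the following; the threshold and the tiny “image” region are made up:

```python
# Illustrative sketch: score a candidate goal detection by the fraction of
# pixels in a region that match the goal colour, yielding a confidence value.
# The 0.6 threshold and the toy region are made-up numbers.

def detect_goal(region, is_goal_colour, min_ratio=0.6):
    total = sum(len(row) for row in region)
    hits = sum(1 for row in region for px in row if is_goal_colour(px))
    confidence = hits / total
    return ("goal", confidence) if confidence >= min_ratio else None

# Toy 3x4 "image" region: 1 = goal-coloured pixel, 0 = background.
region = [[1, 1, 0, 1],
          [1, 1, 1, 1],
          [0, 1, 1, 1]]
print(detect_goal(region, lambda px: px == 1))  # high-confidence detection
```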

Localization – determining the pose

With no overhead cameras on the field, the robot has to determine its pose (x, y, θ) on its own, much as humans do on a football field. Distances to different landmarks are calculated in the robot’s local frame, and this landmark information is then merged to form an estimate of the robot’s position. A particle filter and a Kalman filter were used to localize and then track the robot’s pose as it moves around on noisy odometry data.
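A toy measurement-update and resampling step of such a particle filter might look like this. The landmark position, noise model, and particle count are illustrative, not the tuned values used on the robot:

```python
import math
import random

# Toy particle-filter step for the robot pose (x, y, theta): weight each
# particle by how well it explains a noisy distance to one known landmark,
# then resample. All numbers here are illustrative.

LANDMARK = (4.5, 0.0)            # e.g. a goal post position on the field

def measurement_weight(particle, measured_dist, sigma=0.3):
    px, py, _ = particle
    expected = math.hypot(LANDMARK[0] - px, LANDMARK[1] - py)
    err = measured_dist - expected
    return math.exp(-err * err / (2 * sigma * sigma))

def resample(particles, weights):
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [(random.uniform(0, 6), random.uniform(-2, 2), 0.0)
             for _ in range(200)]
measured = 2.5                    # noisy distance to the landmark
weights = [measurement_weight(p, measured) for p in particles]
particles = resample(particles, weights)
mean_x = sum(p[0] for p in particles) / len(particles)
print(round(mean_x, 2))           # particles cluster ~2.5 m from the landmark
```

In the real system, the odometry (motion) update would shift the particles between measurement updates, and a Kalman filter would then track the fused estimate.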

Intelligence – what to do

This is where most of the ‘thinking’ happens. Information is available about the environment, the long-term aim of winning the game, the shorter-term aim of scoring a goal, and the even shorter-term idea of moving the ball to a more favorable position on the field. All of these influences are used to decide the next state the robot should operate in, driving the robot’s activity in a general direction.

Finer-grained control is exercised by the state machine’s individual states, which fire action commands based only on the information pertinent to that state. For example, a robot operating in the “shoot ball” state will work on positioning itself in front of the ball, facing the required direction, before requesting a kick.

Things get even more interesting when some form of communication is allowed among team members. Robots have to keep in mind what the team is aiming to do (attack formation, defensive stance, passing) and coordinate on the field towards a common goal. These decisions require artificial intelligence coded into the robot to make choices on the fly.
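A minimal sketch of such a behaviour state machine, with made-up states and transition conditions (the project’s actual states are not reproduced here):

```python
# Hedged sketch of a behaviour state machine: each state inspects only the
# information pertinent to it and returns the next state, firing action
# commands along the way. States and thresholds are illustrative.

def search_ball(world):
    return "approach_ball" if world["ball_seen"] else "search_ball"

def approach_ball(world):
    return "shoot_ball" if world["ball_dist"] < 0.2 else "approach_ball"

def shoot_ball(world):
    world["actions"].append("kick")      # fire an action command
    return "search_ball"

STATES = {"search_ball": search_ball,
          "approach_ball": approach_ball,
          "shoot_ball": shoot_ball}

world = {"ball_seen": True, "ball_dist": 0.1, "actions": []}
state = "search_ball"
for _ in range(3):                        # run a few decision ticks
    state = STATES[state](world)
print(state, world["actions"])            # prints: search_ball ['kick']
```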


This favorability map shows how attractive each attacker position is, considering the pose of the opponent goalkeeper.
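One hypothetical way to score positions for such a map is to trade off distance to the goal against how well the goalkeeper blocks the shot line. The geometry and weighting below are purely illustrative, not the scoring actually used:

```python
import math

# Hypothetical favorability score for an attacker position: closer to the
# goal is better, while shot lines blocked by the goalkeeper are penalised.

GOAL = (4.5, 0.0)

def favorability(pos, keeper):
    dist_goal = math.hypot(GOAL[0] - pos[0], GOAL[1] - pos[1])
    # angle between the shot line (pos -> goal) and the keeper direction
    shot = math.atan2(GOAL[1] - pos[1], GOAL[0] - pos[0])
    block = math.atan2(keeper[1] - pos[1], keeper[0] - pos[0])
    clearance = abs(shot - block)
    return clearance / (1.0 + dist_goal)

keeper = (4.2, 0.5)                      # keeper shifted to one side
left = favorability((3.0, -1.0), keeper)
right = favorability((3.0, 1.0), keeper)
print(left > right)                      # the less-covered side scores higher
```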

Kinematics – locomotion

With instructions at hand for executing an action, the robot has to ensure kinematic stability as it carries the action out, which essentially boils down to moving motor joints at carefully controlled speeds through calculated trajectories. Two of the most interesting actions are the kick and standing up:


Kicking

Soccer is incomplete without the robot’s ability to kick the ball with the desired speed and direction. The kick is executed by shifting the robot’s weight onto a single leg, positioning the kicking foot in front of the ball, and then moving it through a calculated trajectory at the required speed.
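The swing phase needs joint angles that start and stop smoothly. One common technique, sketched below with an assumed joint and assumed angle values, is cosine (ease-in/ease-out) interpolation between two joint angles:

```python
import math

# Sketch of a smooth joint trajectory for a kick's swing phase: cosine
# interpolation from a start angle to a target angle, so the joint velocity
# is zero at both ends. The joint name and angles are illustrative.

def cosine_trajectory(start, end, steps):
    """Yield joint angles easing in and out between start and end."""
    for i in range(steps + 1):
        s = (1 - math.cos(math.pi * i / steps)) / 2   # 0 -> 1, smooth
        yield start + (end - start) * s

# e.g. swing a hip pitch joint from -0.2 rad to 0.5 rad in 10 steps
angles = list(cosine_trajectory(-0.2, 0.5, 10))
print(round(angles[0], 2), round(angles[-1], 2))      # endpoints match exactly
```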

Standing up

Humanoid robots tend to fall every now and then, and the most basic action expected of one is the ability to get back on its feet without external assistance. Standing up as quickly as possible, taking into account the posture in which the robot fell, is equally relevant.
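One way to pick a get-up routine based on the fall posture is to look at the torso accelerometer. The axis convention and thresholds below are assumptions for illustration, not NAO’s documented sensor frame:

```python
# Sketch: choose a get-up routine from the torso accelerometer reading.
# We assume accel_x > 0 when the robot lies on its front and < 0 on its
# back; the 5.0 m/s^2 threshold is likewise an assumption.

def getup_routine(accel_x):
    if accel_x > 5.0:
        return "getup_from_front"
    if accel_x < -5.0:
        return "getup_from_back"
    return "already_upright"

print(getup_routine(9.4))   # prints: getup_from_front
```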

Footstep Planning – watch the step

A robot approaching the ball before a kick needs to plan its footsteps carefully, positioning itself in front of the ball along the required kick direction while avoiding hitting the ball before intended. In humanoids, footsteps are planned and loaded into a queue, which is iteratively consumed by the walk engine.


Given the momentum of the walk, the immediate footsteps already loaded into the walk engine cannot be removed without risking a fall.
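A hedged sketch of such a queue, in which an assumed number of committed steps cannot be replaced during replanning:

```python
from collections import deque

# Sketch of a footstep queue where the next few steps are "locked": the
# walk engine has committed to them, so replanning may only replace the
# tail. LOCKED = 2 is an illustrative number, not the engine's actual limit.

LOCKED = 2

def replan(queue, new_steps):
    """Keep the committed prefix, replace everything after it."""
    kept = list(queue)[:LOCKED]
    return deque(kept + list(new_steps))

steps = deque(["L1", "R1", "L2", "R2"])
steps = replan(steps, ["L2'", "R2'", "L3'"])
print(list(steps))   # the first two steps survive the replan
```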

Software Architecture

Software for controlling a real-time system, like a humanoid robot, has a set of requirements to begin with:

  • Multiple Threads – various modules have to run in parallel, independent of each other
  • Information Sharing – between the various modules
  • Real-Time Constraints – some modules are more time-critical than others
  • Remote Monitoring and Debugging – to speed up the development loop on an embedded system

The overall system design can be simplified as shown below:


The architecture had to expose feedback loops with different levels of urgency. Reactive feedback was required for some actions (positioning the robot in front of the ball, for example), while some activities required a more deliberate feedback after planning and processing the possibilities (determining the most favorable position of robot in field, for example). The feedback loops were achieved by a layered design, with configurable interconnects between them.

The state machine, perception, and planning layers were able to share data adequately through a common shared scratch memory. The link between the state machine and the actuation layer, however, had to be event-based to ensure certain actions were triggered exactly once; this was realized by implementing a message queue between the two layers.
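A minimal sketch of the two sharing mechanisms, with illustrative variable and action names:

```python
import queue
import threading

# Sketch of the two mechanisms described above: a lock-protected shared
# "scratch memory" for state that is polled (latest value wins), and a
# message queue for actions that must be delivered exactly once.

scratch = {"ball_pos": (1.0, 0.5)}
scratch_lock = threading.Lock()
action_queue: "queue.Queue[str]" = queue.Queue()

def state_machine():
    with scratch_lock:                         # polled data: read the latest value
        ball = scratch["ball_pos"]
    action_queue.put(f"kick_towards {ball}")   # event: consumed exactly once

def actuation():
    return action_queue.get()                  # each command is handled once

state_machine()
print(actuation())   # prints: kick_towards (1.0, 0.5)
```

The design point is that polled state tolerates being overwritten, while queued events do not: a kick command read twice would fire two kicks.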

Control Panel

Given the requirement for remote monitoring and debugging, a remote logging system was part of the design from the start. Methods were implemented for delivering a snapshot of the system to the development host, where it was rendered in a graphical user interface built with the Java Swing API. Some screenshots of the available views are shared below:


A live video feed (the image to stream was configurable at run time) was displayed in the control panel. All the variables located in scratch memory were visible, and modifiable, from the GUI.
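A minimal sketch of how such a snapshot round trip could work, assuming a JSON encoding and made-up variable names (the actual wire format is not described here):

```python
import json

# Sketch of a system snapshot sent to the development host: serialise the
# scratch-memory variables, and apply edits coming back from the GUI.

scratch = {"state": "approach_ball", "ball_dist": 0.42}

def snapshot():
    """Encode the current scratch memory for the control panel."""
    return json.dumps(scratch)

def apply_edit(payload):
    """The GUI wrote a variable back; merge it into scratch memory."""
    scratch.update(json.loads(payload))

sent = snapshot()
apply_edit('{"ball_dist": 0.30}')
print(json.loads(sent)["ball_dist"], scratch["ball_dist"])  # 0.42 0.3
```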


Some of the real-time information received was best illustrated on a soccer-field view, as shown above.

Development Environment

Over time, the development environment evolved to speed up the process. Unaccustomed to development on Linux back then (how things have changed since), I resorted to using Microsoft Visual Studio as the primary IDE. The NAO SDK, however, provided a cross-compiler that required a Linux host, and the generated binaries had to be transferred to the robot over WiFi and invoked via SSH.


The process of requesting a build from Visual Studio, copying the source code to an Ubuntu machine for cross-compilation, then copying the generated binaries over WiFi and invoking them, was automated using my first ever bash script.
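That flow can be sketched as command construction alone (no network access is attempted here; the host names, paths, and binary name are hypothetical placeholders, not the project’s actual ones):

```python
# Sketch of the automated deploy flow described above, expressed purely as
# command construction. Host names, paths, and the binary name are
# hypothetical placeholders.

def deploy_commands(build_host, robot_host, binary="team_nust_spl"):
    return [
        ["scp", "-r", "src/", f"{build_host}:~/build/src/"],      # copy sources
        ["ssh", build_host, "cd ~/build && cmake . && make"],     # cross-compile
        ["scp", f"{build_host}:~/build/{binary}", f"{robot_host}:~/"],  # copy binary
        ["ssh", robot_host, f"./{binary}"],                       # run on the robot
    ]

for cmd in deploy_commands("ubuntu-vm", "nao.local"):
    print(" ".join(cmd))
```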

Lessons Learnt

The project was a learning experience enhancing both my technical and soft skills.

  • This was my first experience leading a team of peers for a project spanning several months.
  • This project was a crash course to robotics; spanning topics related to vision, localization, kinematics, artificial intelligence and software design.
  • Defining technical dependencies and expectations for modules is important. The different modules of the project were highly interdependent and effective collaboration was imperative.
  • Modelling of physical systems requires error and confidence to be accounted for from the bottom up.
  • Presentation of technical work requires results to be recorded for all experiments conducted and notes taken throughout the course of the project.
  • Methods for observing the remote target system’s state at any time, as unobtrusively as possible, pay off when debugging the system.
  • Investment of time in streamlining and unifying the development environment and process pays in the longer run.

Some points became clearer (usually too little, too late) as the project progressed. I consider these to be the real lessons learnt:

  • Research of existing solutions, open source or otherwise, needs its due attention in the early stages to avoid reinventing the wheel.
  • Development of modules should be accompanied by test cases which can benchmark the module’s effectiveness and performance.
  • Any project spanning more than a few weeks should employ a code versioning system, especially if it involves a multi-person team.

This project was one of the few instances of programming where the output was so readily yet strikingly visible in physical form, as actions carried out by the humanoids were governed by our code. Good times… 🙂


This was a very cursory overview of the expectations and the basic implementation directions adopted in programming NAO robots for the RoboCup Standard Platform League. For a more detailed technical write-up of our efforts, see the publications section below.

A huge shout out to the team and our supervisors! 🙂



Our project was covered by a number of national media outlets for being the only team from South Asia to qualify for RoboCup SPL.

