The Augmented Human
At GoodAI, we are building collaborative AI agents that enhance human capabilities and drive positive change at scale. Collective intelligence is the guiding principle behind our work. One of our teams is dedicated to developing AI agents based on large language models, including assistants, programmers, researchers, and a Stoic Mentor. Simultaneously, another team is developing the GoodAI Groundstation platform, which allows the agents to operate in the physical world.
GoodAI Groundstation empowers individuals to control multiple robots simultaneously without specialized training. Users provide high-level goals, and the robots carry out tasks autonomously. A team member’s engagement in community safety gave us our first real-life use case in crime prevention, and a motivation to create a tool to democratize safety. Safety is still a luxury in many places, but we believe it should be a human right.
Teach The AI Agents What You Need
With every milestone, we make the system more adaptable. Instead of relying on rigid, hard-coded features, the user will teach the system to operate in specific use cases. Our AI agents have already learned to use the Groundstation API, guided only by natural language instructions from the developers. Watch the user teach our AI agent to fly the drones in specific ways:
You can read more about how we teach LLM agents to fly drones here.
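To make the idea concrete, here is a minimal sketch of how an agent can turn natural-language instructions into calls against a drone API. All names here (`FakeGroundstation`, `take_off`, `fly_to`, `land`, the keyword matching) are illustrative assumptions, not the real Groundstation API; in the real system an LLM would do the intent extraction that the simple keyword check stands in for.

```python
# Sketch of a natural-language-to-API loop. The Groundstation client and
# its methods are hypothetical stand-ins; a real agent would let an LLM
# choose the tool and parse parameters instead of keyword matching.

from dataclasses import dataclass, field


@dataclass
class FakeGroundstation:
    """Stand-in for a drone-control client; records issued commands."""
    log: list = field(default_factory=list)

    def take_off(self):
        self.log.append("take_off")

    def fly_to(self, lat, lon):
        self.log.append(f"fly_to({lat}, {lon})")

    def land(self):
        self.log.append("land")


def run_instruction(gs, instruction):
    """Map a free-form instruction to an API call (LLM stand-in)."""
    text = instruction.lower()
    if "take off" in text:
        gs.take_off()
    elif "land" in text:
        gs.land()
    elif "fly to" in text:
        # A real agent would extract coordinates from the text or a map;
        # here they are hard-coded for the sketch.
        gs.fly_to(50.08, 14.43)
    else:
        raise ValueError(f"no tool matches: {instruction!r}")


gs = FakeGroundstation()
for step in ["Take off", "Fly to the warehouse", "Land"]:
    run_instruction(gs, step)
print(gs.log)  # → ['take_off', 'fly_to(50.08, 14.43)', 'land']
```

The point of the pattern is that the vocabulary of available actions lives in the agent, so adding a new capability means describing a new tool rather than hard-coding a feature.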
Some of the key functionalities of GoodAI Groundstation, tested in real-life conditions and ready for deployment, currently include:
- Remote operations with no manual piloting or on-site pilot required
- Autonomous patrolling missions
- Object detection: persons, vehicles
- Autonomous object tracking
- Drone collaboration: detections from one drone instantly create tasks for another, for example sending a second drone to take a closer look at an object detected during an earlier patrol mission
- “Panic button”: instantly sends a drone to investigate an object located on the map. Triggered by an operator’s click on a detected object on the map, or by an external device such as a panic button app, a camera, or a motion sensor
- Autonomous take-off and landing
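The drone-collaboration and panic-button features above share one pattern: an event (a detection or an external trigger) is converted into a task that another drone can pick up. A hedged sketch of that flow, assuming entirely illustrative class and field names rather than the real Groundstation data model:

```python
# Sketch of event-driven drone collaboration: a detection from one drone,
# or a panic-button press, becomes an investigation task for another drone.
# All names (Detection, Task, the queue layout) are illustrative assumptions.

from dataclasses import dataclass
from queue import Queue


@dataclass
class Detection:
    source: str   # which drone or sensor produced it, e.g. "drone-1"
    label: str    # detected object class, e.g. "person", "vehicle"
    lat: float
    lon: float


@dataclass
class Task:
    kind: str
    lat: float
    lon: float


task_queue: "Queue[Task]" = Queue()


def on_detection(det: Detection) -> None:
    """Turn any detection into a close-look task for an available drone."""
    task_queue.put(Task(kind=f"inspect_{det.label}", lat=det.lat, lon=det.lon))


def on_panic_button(lat: float, lon: float) -> None:
    """External triggers (app, camera, motion sensor) feed the same queue."""
    task_queue.put(Task(kind="investigate", lat=lat, lon=lon))


# A patrol drone spots a vehicle; a panic button fires shortly after.
on_detection(Detection(source="drone-1", label="vehicle", lat=-33.90, lon=18.40))
on_panic_button(lat=-33.91, lon=18.41)

next_task = task_queue.get()
print(next_task.kind)  # → inspect_vehicle
```

Decoupling producers (detections, buttons, sensors) from consumers (whichever drone is free) through a queue is what lets new trigger devices be added without touching the drones.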
The UI allows you to run complex missions with a few clicks. Note that for safety reasons, a human is always in the loop to confirm the drone’s next action.
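The human-in-the-loop rule can be expressed as a simple guard: every autonomous action is proposed first and executed only after the operator approves it. A minimal sketch, where the `confirm` callback is a hypothetical stand-in for the operator’s click in the UI:

```python
# Sketch of the human-in-the-loop safety pattern: actions are proposed,
# and only operator-approved ones run. The confirm callback stands in
# for a click in the UI; names are illustrative, not the real interface.

def execute_with_confirmation(action, confirm):
    """Run `action` only if the operator's `confirm` callback approves it."""
    if confirm(action):
        return f"executed: {action}"
    return f"skipped: {action}"


# The operator approves take-off but vetoes the follow action.
approved = {"take_off": True, "follow_person": False}
results = [execute_with_confirmation(a, lambda act: approved[act])
           for a in ["take_off", "follow_person"]]
print(results)  # → ['executed: take_off', 'skipped: follow_person']
```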
You can watch our recent GoodAI Groundstation live demos in South Africa below.
Drone patrol mission:
Drone follows person:
Next Steps: Harnessing the Power of Collectives
We aim to leverage the power of collectives further and build a robust, efficient, and flexible tool. Rather than relying on expensive specialized hardware, we want to harness the collective’s emergent capacity for cooperation to achieve goals.
We will focus on swarms of small drones that rely on sensor fusion and collaborative problem-solving in dynamically changing environments and hardware configurations. Imagine a group of inexpensive mini-drones as a safe, lightweight, and user-friendly setup instead of a single large, demanding, all-encompassing drone.
We will also involve our AI agents more heavily, allowing them to learn from the user and operate within the Groundstation via real-time conversations in natural language.
We are building the GoodAI Groundstation as a versatile platform suitable for any robot or agent. Picture someone creating a makeshift cardboard drone and, with the help of AI agents, easily integrating it with our Groundstation, ready to take flight. Envision users customizing our Groundstation, adding functionalities, incorporating modules, and teaching AI agents to adapt it for specific purposes. Ultimately, we see the Groundstation as an open-ended ecosystem that accommodates everything from existing drones to future nanobots and beyond, augmenting human capabilities in both the digital and physical realms.
If you would like to become part of the GoodAI collective and build collaborative AI agents and their “playground”, the GoodAI Groundstation, with us, join us on our journey.