Specification Language + Evolution Engine
Behaviour emerges from topology, not code.
Define what an agent can sense and do. Define what success looks like. Evolution discovers the neural wiring that produces intelligent behaviour, without a single line of behavioural logic.
A different paradigm
Traditional AI tells agents what to do. Quale lets them figure it out.
Traditional Approach
if hunger > 0.7 and food_nearby:
    move_toward(food)
    eat()
elif thirst > 0.5:
    find_water()
    drink()
else:
    patrol()
- Every behaviour hand-coded
- Predictable after 10 minutes
- Breaks in novel situations
- Scales linearly with complexity
Quale Approach
body agent {
    sensor hunger: internal(0..1)
    sensor food_nearby: directional(4)
    actuator move: directional(4)
    actuator eat: trigger(0.5)
}

fitness { maximize survival: 10 }
- Zero behavioural code
- Unpredictable and naturalistic
- Adapts to novel situations
- Complexity emerges from evolution
What inspired Quale?
In 2024, researchers completed the first full map of a fruit fly's brain: every one of its ~140,000 neurons and the connections between them. When they simulated this wiring digitally, the virtual fly exhibited naturalistic behaviour without any behavioural programming. The structure of the connections alone was enough.
If you faithfully replicate neural topology, behaviour emerges from the structure itself. No rules, no scripts, no decision trees. The wiring is the program.
Quale applies this principle to artificial agents. Instead of mapping a biological connectome, it evolves one from scratch through natural selection.
Winding, M. et al. (2023). "The connectome of an insect brain." Science, 379(6636). Read the paper
How Quale works
Define the body
Specify what your agent can sense (sensors) and do (actuators). This is the interface between brain and world. No behaviour, just capabilities.
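As a rough sketch, a body spec like the `agent` example above can be pictured as plain data: a list of typed inputs and outputs, nothing more. The dictionary layout and the `io_sizes` helper here are illustrative assumptions, not part of the Quale language.

```python
# Hypothetical in-memory form of a body spec: an interface only, no behaviour.
# Field names mirror the `body agent { ... }` example; "slots" counts scalar channels.
body = {
    "sensors": {
        "hunger":      {"type": "internal",    "slots": 1},
        "food_nearby": {"type": "directional", "slots": 4},
    },
    "actuators": {
        "move": {"type": "directional", "slots": 4},
        "eat":  {"type": "trigger",     "slots": 1},
    },
}

def io_sizes(body):
    """Scalar input/output counts the evolved brain must wire together."""
    n_in = sum(s["slots"] for s in body["sensors"].values())
    n_out = sum(a["slots"] for a in body["actuators"].values())
    return n_in, n_out
```

Everything evolution later discovers has to fit through this interface: five input channels in, five output channels out, in this case.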
Define success
Write a fitness function that scores outcomes. "Survive longer = higher score." You define what success looks like, never how to achieve it.
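A fitness function of this kind reduces to weighted outcome terms. A minimal sketch, in the style of `fitness { maximize survival: 10 }` (the weights and metric names are illustrative):

```python
# Illustrative weights; each term scores an outcome, never an action.
WEIGHTS = {"survival": 10.0, "health": 5.0}

def fitness(outcome):
    """Score an agent's measured outcomes. Says nothing about how to act."""
    return sum(w * outcome.get(metric, 0.0) for metric, w in WEIGHTS.items())
```

An agent that survived for 2 time units at full health would score 10·2 + 5·1 = 25, regardless of how it achieved that.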
Evolve
A population of random neural topologies is evaluated, selected, crossed over, and mutated. Over thousands of generations, behaviour emerges from the wiring.
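The evaluate, select, cross over, mutate cycle can be sketched as a toy loop. This is a deliberately simplified stand-in (flat weight lists, truncation selection, one-point crossover), not the Quale engine, which evolves topologies rather than fixed-length genomes:

```python
import random

def evolve(evaluate, genome_len=8, pop=30, gens=60, seed=0):
    """Toy evaluate/select/crossover/mutate loop -- a sketch, not the engine.
    `evaluate` scores a genome (here a flat list of weights); higher is fitter."""
    rng = random.Random(seed)
    population = [[rng.uniform(-1, 1) for _ in range(genome_len)]
                  for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop // 2]          # truncation selection
        children = [scored[0][:]]             # elitism: keep the best genome
        while len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)           # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(genome_len)] += rng.gauss(0, 0.2)  # mutation
            children.append(child)
        population = children
    return max(population, key=evaluate)

# Toy objective standing in for a survival score: pull every weight toward 1.
best = evolve(lambda g: -sum((w - 1.0) ** 2 for w in g))
```

Even this crude loop reliably climbs the fitness landscape; the engine applies the same pressure to network wiring instead of a weight vector.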
Deploy
The evolved brain runs via signal propagation. Sensor values flow through weighted connections and produce actuator commands. Microseconds per tick.
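One tick can be pictured as plain signal propagation over the evolved edges. A minimal feed-forward sketch, assuming edges are `(source, destination, weight)` triples in topological order and actuator inputs are squashed by a sigmoid (all assumptions for illustration):

```python
import math

def tick(sensor_values, edges, actuator_names):
    """One propagation tick over a feed-forward topology.
    edges: (src, dst, weight) triples, assumed topologically ordered."""
    activation = dict(sensor_values)       # seed with current sensor readings
    for src, dst, weight in edges:
        activation[dst] = activation.get(dst, 0.0) + weight * activation.get(src, 0.0)
    # Squash each actuator's summed input into (0, 1) as a command strength.
    return {name: 1.0 / (1.0 + math.exp(-activation.get(name, 0.0)))
            for name in actuator_names}

# A hungry agent with a single strong hunger -> eat connection.
commands = tick({"hunger": 0.9}, [("hunger", "eat", 4.0)], ["eat", "move"])
```

There is no decision logic to execute at runtime, only additions and multiplications over the wiring, which is why a tick costs microseconds.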
What does "no behavioural code" actually mean?
To validate the engine, the first test domain was a survival simulation. Agents had to discover how to stay alive on their own. The same engine applies to any domain: game AI, network security, safety modelling, robotics.
At generation zero, the agent has random wiring. It wanders aimlessly, ignores resources, and takes no meaningful actions. It fails almost immediately.
Over early generations, agents whose random wiring happened to produce useful actions survive slightly longer. Their wiring is passed on, and the population begins converging on basic strategies.
Soon the population reliably performs the core actions needed to meet the fitness criteria. Nobody programmed this; the wiring was selected because it scored higher.
Eventually, complex strategies emerge: agents respond to other agents, avoid hazards by observing consequences, and develop behaviours no human designer would have written.
Proof of concept
The engine was validated using a survival simulation as the test domain. Three phases of experiments demonstrated progressively complex emergent behaviour from topology evolution alone.
Phase 1: Foraging from survival pressure
Agents evolved to eat food and drink water purely because not doing so killed them. With consumption rewards completely removed, the behaviour persists. Validated across 10 independent seeds with 787% average fitness improvement.
Phase 2: Sensory food discrimination
When dangerous food was introduced, agents evolved 93% avoidance using colour, smell, and texture signals. A 7:1 safe-to-bad ratio emerged from sickness penalties alone. Fresh evolution outperformed seeded brains, showing prior knowledge can block new learning.
Phase 3: Evolved social instinct
Under complete sensory randomisation (no static cues to exploit), agents evolved to suppress eating near a peer and flee when the peer showed sickness. A 3.4:1 discrimination ratio emerged from social observation alone. All 6 peer sensor connections evolved negative weights: avoid, suppress, flee.
Strategies no human would write
The best Phase 1 brain disconnected its hunger sensor entirely, favouring proactive positioning over reactive eating. The best Phase 3 brain achieved social avoidance with zero hidden nodes: direct stimulus-response wiring analogous to innate alarm responses in biological organisms.
Evolution across phases
Each phase introduced new challenges. The engine discovered new capabilities at each stage.
Phase 1 challenge: Find food and water to stay alive. No instructions given.
Result: 787% fitness improvement across 10 seeds. Foraging behaviour emerged from survival pressure alone.
Applications
Same engine. Different sensors and actuators. Completely different emergent behaviour.
Game AI
NPCs with connectome brains that develop genuine personality through experience. Firefighters that evolve rescue tactics. Civilians that panic differently. No behaviour trees, and every playthrough is unique.
Network Security
IDS/IPS systems that detect zero-day attacks through evolved anomaly recognition rather than signature matching. Graduated responses like throttle, quarantine, and honeypot redirect, evolved for each threat profile.
Robotics
Evolved locomotion and navigation for physical robots. Swarm coordination without central control. Graceful degradation when sensors fail, because the connectome adapts with remaining inputs.
Human Factors
Model how operators behave under fatigue, stress, and time pressure in safety-critical environments. Predict failure modes from compounding factors that single-variable models miss. Discover risk patterns before incidents occur.
The language
Declare topology, not behaviour. Evolve, don't program. Sense and act, don't think.
body firefighter {
    sensors {
        heat: range(0..1)
        visibility: range(0..1)
        structural: range(0..1)
        civilian_near: range(0..1)
        radio: signal
    }
    actuators {
        move: direction(8)
        breach: target(door)
        carry: target(civilian)
        call_backup: signal(team)
    }
}

brain rescue_brain {
    inputs from firefighter
    outputs to firefighter

    region instinct {
        nodes: 48
        density: 0.6
        speed: 1 tick
    }
    region planning {
        nodes: 96
        density: 0.15
        recurrent: true
    }

    pathways {
        heat -> instinct
        instinct -> move
    }
}

evolve rescue_evo {
    body: firefighter
    world: burning_building
    population: 500
    generations: 10000

    fitness {
        maximize survival: 10.0
        maximize health: 5.0
        reward civilian_found: 50.0
        penalize civilian_lost: 80.0
        penalize idle: 2.0
    }
}
$ quale evolve rescue_evo.quale --population 500
[Gen 0001] Best: -340 Avg: -12 Rescue: 0.02
[Gen 0100] Best: 180 Avg: 45 Rescue: 0.34
[Gen 1000] Best: 310 Avg: 120 Rescue: 0.72
[Gen 5000] Best: 385 Avg: 205 Rescue: 0.88
[Gen 8847] Converged. Rescue rate: 0.91
Saved: rescue_gen8847.quale-brain
$ quale run apartment_fire.quale
Running apartment_fire... http://localhost:3000
Why "Quale"?
A quale (plural: qualia) is the subjective, conscious experience of a sensation. The "what it's like" to feel heat, see red, or taste bitter.
Quale the language asks the question at the heart of this paradigm: if behaviour emerges from topology alone, and the right wiring produces navigation, survival, fear, and cooperation, does the topology also produce experience?
We don't know. But the question is worth building toward.