Wave_Engine is a high-performance, mathematically driven UI component built natively for Android. It serves as the real-time visual telemetry HUD for the JARVIS ecosystem, rendering live audio data into a dynamic, 60fps holographic wave.
Unlike standard UI components that rely on heavy external libraries or Machine Learning for simple animations, this engine uses raw calculus and parametric equations to achieve a "military-grade" aesthetic with near-zero UI lag.
- Pure Mathematical Rendering: Uses trigonometric power functions and dynamic radius calculations to map screen coordinates in real-time.
- Modular Architecture (Strategy Pattern): Built heavily on Separation of Concerns. The `AudioSource` interface allows instant hot-swapping between the hardware microphone and the custom JARVIS Alpha AI sensory module.
- Military-Grade Aesthetic: Features logarithmic audio gain scaling, high-frequency mathematical jitter, and OLED-optimized neon shadow rendering.
- Zero-Stutter Performance: Optimized Android `Canvas` rendering. By decoupling object initialization from the `onDraw` loop, it prevents Garbage Collection (GC) churn and maintains a buttery-smooth 60 FPS on mobile SoCs.
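The "allocate once, reuse every frame" discipline behind that last point can be illustrated in plain Java. This is a minimal sketch of the pattern, not the actual View code: the class name `FrameRenderer` and buffer layout are illustrative, standing in for the pre-allocated `Paint`/`Path` objects the real View keeps outside `onDraw()`.

```java
// Illustrative sketch of the GC-avoidance pattern (not the actual View code).
final class FrameRenderer {
    // Allocated exactly once at construction, mirroring how the View
    // pre-allocates its Paint/Path objects outside the onDraw loop.
    private final float[] points = new float[360 * 2];

    // Hot path: fills the existing buffer in place, allocating nothing,
    // so repeated frames never feed the garbage collector.
    float[] renderFrame(double amplitude) {
        for (int deg = 0; deg < 360; deg++) {
            double theta = Math.toRadians(deg);
            points[deg * 2] = (float) (amplitude * Math.cos(theta));
            points[deg * 2 + 1] = (float) (amplitude * Math.sin(theta));
        }
        return points;
    }
}
```

Because `renderFrame` hands back the same buffer every call, a 60 FPS loop produces zero per-frame garbage.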
The core shape is generated by plotting parametric coordinates through a 360-degree cycle. To achieve sharp, aggressive spikes rather than soft sine waves, the engine utilizes an absolute odd-power function combined with high-frequency noise:
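As a standalone sanity check, the same formulas can be exercised in a plain Java harness. The constants passed in here (`baseRadius`, `amplitude`, `freq`, and the center `h`, `k`) are illustrative placeholders, not the engine's tuned values:

```java
// Standalone harness for the spike math; all constants are illustrative.
final class WaveMathDemo {
    static float[] plot(double baseRadius, double amplitude, int freq,
                        double h, double k, int samples) {
        float[] xy = new float[samples * 2];
        for (int i = 0; i < samples; i++) {
            double theta = 2 * Math.PI * i / samples;
            // |sin|^5 sharpens soft lobes into aggressive outward spikes
            double r = baseRadius
                     + amplitude * Math.pow(Math.abs(Math.sin(freq * theta)), 5);
            // high-frequency jitter adds the electrical "shimmer" texture
            r += (amplitude * 0.1) * Math.sin(theta * 250);
            // Cartesian mapping around the center (h, k)
            xy[i * 2] = (float) (h + r * Math.cos(theta));
            xy[i * 2 + 1] = (float) (k + r * Math.sin(theta));
        }
        return xy;
    }
}
```

Every point lands between `baseRadius - 0.1 * amplitude` and `baseRadius + 1.1 * amplitude` from the center, which bounds the wave inside a predictable ring on screen.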
```java
// The Math Engine Core
double sinValue = Math.sin(freq * theta);

// Math.pow(abs(sin), 5) creates the sharp, outward-facing spikes
double r = baseRadius + (amplitude * Math.pow(Math.abs(sinValue), 5));

// High-frequency jitter adds a raw, electrical "shimmer" texture
double jitter = (amplitude * 0.1f) * Math.sin(theta * 250);
r += jitter;

// Cartesian mapping for Canvas rendering
float x = (float) (h + r * Math.cos(theta));
float y = (float) (k + r * Math.sin(theta));
```

The codebase is strictly decoupled to allow seamless integration into the wider JARVIS project:
- The Contract (`AudioSource.kt`): The interface defining how audio modules talk to the UI.
- The Ears (`StandardMicSource.kt` / `AlphaSource.kt`): Captures raw PCM audio data, applies square-root gain scaling to expand the perceived dynamic range, and emits normalized amplitudes.
- The Brain (`Wave_Logic.java`): A stateless math engine that processes amplitude inputs into geometric `Path` objects.
- The Body (`Link_wave_logic.kt`): The custom Android `View` that paints the computed path to the device screen.
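The Strategy Pattern at the heart of this split can be sketched as follows. The real contract lives in Kotlin (`AudioSource.kt`), so this Java sketch is only an illustration: the method name `readAmplitude` and the `FakeMicSource` stand-in are assumptions, not the repo's actual API.

```java
// Java sketch of the Strategy Pattern; the real contract is Kotlin
// (AudioSource.kt) and its member names may differ.
interface AudioSource {
    /** Returns a normalized amplitude in [0, 1] for the current frame. */
    double readAmplitude();
}

// Stand-in for StandardMicSource: a raw 16-bit PCM magnitude run through
// the square-root gain curve so quiet input still visibly moves the wave.
final class FakeMicSource implements AudioSource {
    private final double rawPcm; // pretend this came from the microphone
    FakeMicSource(double rawPcm) { this.rawPcm = rawPcm; }

    @Override public double readAmplitude() {
        double normalized = Math.min(Math.abs(rawPcm) / 32768.0, 1.0);
        return Math.sqrt(normalized); // expands the quiet end of the range
    }
}
```

Because the View only ever sees the interface, swapping implementations is a one-line change in the dependency injection site.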
Requirements

- Android Studio (Koala or newer)
- Minimum SDK: API 24
- Permissions: `android.permission.RECORD_AUDIO`
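The microphone permission is declared in the app manifest in the standard way (minimal sketch):

```xml
<!-- AndroidManifest.xml: required before StandardMicSource can open the mic -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```

Note that `RECORD_AUDIO` is a dangerous-level permission, so on API 23+ (which includes this project's minimum of API 24) it must also be granted by the user at runtime, not just declared in the manifest.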
Swapping to Alpha Module

To upgrade the engine from standard mic processing to the intelligent Alpha module, update the dependency injection in `MainActivity.kt`:
```kotlin
// FROM: Standard Microphone
private val audioSource: AudioSource = StandardMicSource()

// TO: JARVIS Alpha Module
private val audioSource: AudioSource = AlphaSource(AlphaModule())
```
Santanu Sarkar

AI collaboration: Gemini (Google) was used for debugging assistance.
