sentient_ui 0.1.2
A Flutter package for emotion-aware adaptive user interfaces with on-device emotion detection and real-time UI adaptation.
SENTIENT UI
An emotion-aware adaptive interface framework for Flutter
Sentient UI is a Flutter framework for building interfaces that adapt dynamically based on user emotion, behavior, and contextual signals. It introduces an adaptive layer that operates entirely on-device, enabling emotionally responsive user interfaces while maintaining strict privacy guarantees.
This project explores how affective computing and interaction analytics can be engineered into real-world UI systems in a practical, developer-friendly manner.
Research & Documentation
https://ahmadrob.github.io/sentient_ui/
A research-driven exploration of emotion recognition, adaptive UI systems, and human-centered interface optimization.
Table of Contents #
- Demo
- Overview
- Core Capabilities
- Installation
- Platform Configuration
- Basic Usage
- Widget Integration
- Advanced Configuration
- System Architecture
- Privacy & Compliance
- Model Attribution
- Contributing
- License
- Contact
Demo #
A live demonstration of UI elements adapting to detected emotional and behavioral states.
A complete runnable example is available in the example/ directory.
Overview #
Traditional interfaces assume static user conditions. Sentient UI challenges this assumption by introducing a runtime adaptation engine that responds to emotional and behavioral signals in real time.
The framework integrates multiple on-device signals, including facial expression analysis, interaction patterns, and environmental context, to infer user state and adjust interface presentation accordingly. All processing is performed locally, with no cloud dependency and no data transmission.
Sentient UI is designed as a framework, not a demo or experiment. It emphasizes architectural clarity, extensibility, and research validity.
Core Capabilities #
Emotion Recognition Engine #
On-device facial expression analysis using lightweight neural models to infer a set of core emotional states.
- Model: MobileNet-based emotion recognition model (quantized TFLite)
- Assets Used: assets/models/emotion_model.tflite and assets/models/emotion_labels.txt
- Model Source: Based on emotion-recognition-app by MdIrfan325
- Privacy: Camera frames are processed in memory only and are never stored or transmitted.
Contextual Awareness #
Environmental signals such as ambient noise levels, device motion, lighting conditions, and battery status are incorporated to refine adaptation decisions.
Behavioral Analysis #
Interaction patterns, including tap frequency, gesture intensity, and scroll irregularities, are analyzed to detect indicators such as frustration or cognitive overload.
Adaptive Theme System #
AnimatedEmotionTheme enables smooth interpolation between emotional states, adjusting color palettes, typography, spacing, and motion characteristics.
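As a rough sketch of where this sits in a widget tree (the constructor parameters below mirror Flutter's AnimatedTheme and are assumptions, not confirmed package API):

// Hypothetical usage sketch: `duration` and `child` mirror Flutter's
// AnimatedTheme and may differ from the actual AnimatedEmotionTheme API.
AnimatedEmotionTheme(
  duration: const Duration(milliseconds: 300),
  child: const Placeholder(),
)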
Adaptive Widget Layer #
Sentient UI provides adaptive alternatives to common Flutter widgets (e.g., containers, buttons, text) that respond automatically to emotional state changes without additional configuration.
Privacy-First Design #
- All computation is local
- No telemetry
- No cloud processing
- No data persistence beyond runtime needs
Installation #
Add the dependency to your pubspec.yaml:
dependencies:
  sentient_ui: ^0.1.2
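Then fetch the package:

flutter pub get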
Platform Configuration #
Android #
- Permissions: Add the following to android/app/src/main/AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
- Model Compression: To ensure the TFLite model loads correctly, prevent Android from compressing the model file. Add the following to android/app/build.gradle inside the android block:
android {
// ... other config
aaptOptions {
noCompress 'tflite'
noCompress 'lite'
}
}
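Note: on recent Android Gradle Plugin versions, aaptOptions is deprecated in favor of androidResources; the equivalent block (verify against your AGP version) is:

android {
    androidResources {
        noCompress 'tflite', 'lite'
    }
}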
iOS #
Add the following entries to ios/Runner/Info.plist:
<key>NSCameraUsageDescription</key>
<string>Used for on-device facial expression analysis to support adaptive UI behavior.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to measure ambient noise levels for contextual awareness.</string>
Basic Usage #
Wrap your application root with SentientApp:
import 'package:flutter/material.dart';
import 'package:sentient_ui/sentient_ui.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({super.key});
@override
Widget build(BuildContext context) {
return SentientApp(
title: 'Adaptive Application',
enableEmotionTheming: true,
captureInterval: const Duration(seconds: 5),
home: const HomeScreen(),
);
}
}
Widget Integration #
Replace standard widgets with Sentient counterparts to enable adaptive behavior:
class HomeScreen extends StatelessWidget {
const HomeScreen({super.key});
@override
Widget build(BuildContext context) {
return SentientScaffold(
appBar: SentientAppBar(
title: const Text('Dashboard'),
),
body: Center(
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
const SentientText(
'Welcome to adaptive interfaces',
style: TextStyle(fontSize: 24, fontWeight: FontWeight.w600),
),
const SizedBox(height: 32),
SentientButton(
onPressed: _handleAction,
child: const Text('Continue'),
),
const SizedBox(height: 32),
SentientContainer(
width: 280,
height: 160,
child: const Center(
child: Text('Emotion-aware content'),
),
),
],
),
),
);
}
void _handleAction() {}
}
Advanced Configuration #
Selective Feature Control #
Enable or disable subsystems explicitly within your runApp or build method:
void main() {
runApp(
SentientApp(
home: const HomeScreen(),
enableEmotionDetection: false, // Disable camera features
enableContextSensing: true, // Keep context sensing active
enableBehaviorTracking: true, // Keep behavior tracking active
),
);
}
Runtime Engine Control #
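The engine can be reconfigured while the app is running through the SentientEngine instance exposed via Provider: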
// Example: Updating configuration from a settings screen
void updateSettings(BuildContext context) {
// Access the engine via Provider
final engine = context.read<SentientEngine>();
// Update configuration dynamically
engine.updateConfig(
engine.config.copyWith(
captureInterval: const Duration(minutes: 1), // Reduce frequency to save battery
),
);
// Manually pause/resume processing
// engine.pause();
// engine.resume();
}
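Since the engine is obtained through Provider, widgets can also observe it reactively with watch(); a minimal sketch, assuming SentientEngine notifies its listeners when its configuration or state changes:

class EngineStatusLabel extends StatelessWidget {
  const EngineStatusLabel({super.key});

  @override
  Widget build(BuildContext context) {
    // watch() rebuilds this widget whenever the engine notifies listeners
    // (assumes SentientEngine is a ChangeNotifier, as the read() usage above
    // suggests; requires package:provider for the read()/watch() extensions).
    final engine = context.watch<SentientEngine>();
    return Text('Capture every ${engine.config.captureInterval.inSeconds}s');
  }
}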
Design Guidelines #
Sentient UI provides the adaptive behavior and theming infrastructure, but developers are responsible for maintaining visual consistency and following aesthetic best practices in their implementations.
Developer Responsibilities #
Theme Compatibility
While the framework handles emotional state transitions and theme interpolation, your custom UI components should be designed to work harmoniously with the adaptive color palettes, spacing systems, and typography scales provided by the framework.
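For example, deriving colors and text styles from the ambient theme rather than hard-coding them lets custom components pick up adaptive palette changes automatically. This sketch assumes the adaptive palette is surfaced through Flutter's standard Theme, which is an assumption about how SentientApp wires theming rather than documented behavior:

class AdaptiveCard extends StatelessWidget {
  const AdaptiveCard({super.key, required this.label});

  final String label;

  @override
  Widget build(BuildContext context) {
    // Reading from Theme.of(context) instead of using hard-coded values lets
    // this widget follow whatever palette the adaptive layer currently provides.
    final theme = Theme.of(context);
    return Card(
      color: theme.colorScheme.surface,
      child: Padding(
        padding: const EdgeInsets.all(16),
        child: Text(label, style: theme.textTheme.titleMedium),
      ),
    );
  }
}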
Visual Hierarchy
Maintain consistent visual hierarchy across emotional states. Ensure that critical UI elements remain accessible and recognizable regardless of the active theme.
Accessibility
Test your interface across all emotional states to ensure sufficient color contrast, readable text sizes, and appropriate touch target sizes are maintained throughout adaptation cycles.
Motion Design
The framework provides smooth transitions between states, but custom animations and interactions should complement, not conflict with, the adaptive motion characteristics.
Best Practices #
- Design components that gracefully adapt to color and spacing changes
- Test UI layouts with different emotional theme configurations
- Ensure interactive elements maintain usability across all states
- Document any custom theming extensions or overrides
- Follow Flutter's material design or cupertino guidelines as a foundation
Sentient UI enhances your interface with adaptive behavior; thoughtful design ensures that enhancement is effective.
System Architecture #
Processing Pipeline #
Input Layer
- Emotion detection (camera-based)
- Environmental context aggregation
- Interaction behavior tracking
Core Engine
Combines multi-modal inputs using weighted heuristics to infer user state. Behavioral indicators are prioritized when conflicting signals arise.
Adaptation Layer
Maps inferred states to concrete EmotionTheme configurations using interpolation logic.
Presentation Layer
Widgets react automatically through inherited theme propagation and animated transitions.
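In short: input signals flow into the core engine, which fuses them into an inferred user state, maps that state to an EmotionTheme, and propagates the theme down the widget tree, where Sentient widgets animate toward the new configuration.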
Privacy & Compliance #
Sentient UI processes sensitive biometric signals. Applications using this framework must clearly disclose:
- Camera usage is limited to transient, in-memory facial analysis
- Microphone usage is limited to ambient noise level measurement
- No biometric data is stored, logged, or transmitted
Developers are responsible for ensuring compliance with applicable regulations (e.g., GDPR, CCPA, BIPA).
Model Attribution #
The emotion recognition model (assets/models/emotion_model.tflite) included in this package is based on work from:
emotion-recognition-app by MdIrfan325
The model has been integrated into this framework to provide on-device emotion detection capabilities. All credit for the model architecture and training goes to the original author.
Contributing #
Contributions are welcome.
- Fork the repository
- Create a feature branch
- Add tests and documentation
- Commit with clear messages
- Submit a pull request describing the change and rationale
Architectural consistency and code quality are expected.
License #
MIT License. See LICENSE.
Contact #
Research & Collaboration: ahmed.abualrob@gmail.com
For bug reports or feature requests, please open an issue on GitHub.
