
CARP Mobile Sensing Framework in Flutter #


This library contains the core Flutter package for the CARP Mobile Sensing (CAMS) framework. Supports cross-platform (iOS and Android) mobile sensing.

For an overview of all CAMS packages, see CARP Mobile Sensing in Flutter. For documentation on how to use CAMS, see the CAMS wiki.

Usage #

To use this plugin, add carp_mobile_sensing as a dependency in your pubspec.yaml file.
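
For example, the dependency section of pubspec.yaml could look like this (the version shown matches this release; newer versions may be available on pub.dev):

dependencies:
  carp_mobile_sensing: ^0.10.0+5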

Android Integration #

Add the following to your app's AndroidManifest.xml file located in android/app/src/main:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="<your_package_name"
    xmlns:tools="http://schemas.android.com/tools">

   ...
   
    <!-- The following permissions are used for CARP Mobile Sensing -->
    <uses-permission android:name="android.permission.PACKAGE_USAGE_STATS" tools:ignore="ProtectedPermissions"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

</manifest>

NOTE: Other CAMS sampling packages require additional permissions in the AndroidManifest.xml file. See the documentation for each package.

NOTE: Version 0.5.0 is migrated to AndroidX. This requires any Android app using this plugin to also migrate if it is using the original support library. See Flutter AndroidX compatibility.

iOS Integration #

The pedometer (step count) probe uses the Core Motion framework on iOS, and the NSMotionUsageDescription key needs to be specified in the app's Info.plist file located in ios/Runner:

  <key>NSMotionUsageDescription</key>
  <string>Collecting step count.</string>

Documentation #

The Dart API doc describes the different libraries and classes.

The wiki contains detailed documentation on the CARP Mobile Sensing Framework, including the domain model, how to use it by creating a Study configuration, how to extend it, and an overview of the different Measure types available.

A scientific description of CAMS is available at arxiv.org:

  • Bardram, Jakob E. "The CARP Mobile Sensing Framework--A Cross-platform, Reactive, Programming Framework and Runtime Environment for Digital Phenotyping." arXiv preprint arXiv:2006.11904 (2020). [pdf]
@article{bardram2020carp,
  title={The CARP Mobile Sensing Framework--A Cross-platform, Reactive, Programming Framework and Runtime Environment for Digital Phenotyping},
  author={Bardram, Jakob E},
  journal={arXiv preprint arXiv:2006.11904},
  year={2020}
}

Please use this as a reference in any scientific papers using CAMS.

Examples #

In CAMS, sensing is configured in a Study object and sensing is controlled by a StudyController.

Below is a simple example of how to set up a study that senses step counts (pedometer), ambient light (light), screen activity (screen), and power consumption (battery). The collected data is stored as JSON in a local file on the phone.

// Import package
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

void example() async {
  // Create a study using a local file to store data
  Study study = Study("2", 'user@cachet.dk',
      name: 'A study collecting ..',
      dataEndPoint: FileDataEndPoint()
        ..bufferSize = 500 * 1000
        ..zip = true
        ..encrypt = false);

  // Add an automatic task that immediately starts collecting
  // step counts, ambient light, screen activity, and battery level
  study.addTriggerTask(
      ImmediateTrigger(),
      AutomaticTask()
        ..measures = SamplingSchema.common().getMeasureList(
          namespace: NameSpace.CARP,
          types: [
            SensorSamplingPackage.PEDOMETER,
            SensorSamplingPackage.LIGHT,
            DeviceSamplingPackage.SCREEN,
            DeviceSamplingPackage.BATTERY,
          ],
        ));

  // Create a Study Controller that can manage this study.
  StudyController controller = StudyController(study);

  // await initialization before starting/resuming
  await controller.initialize();
  controller.resume();

  // listen to and print all data events from the study
  controller.events.forEach(print);
}

The above example makes use of the pre-defined SamplingSchema named common. This sampling schema contains a set of default settings for how to sample the different measures.

Sampling can be configured in very sophisticated ways by specifying different types of triggers, tasks, and measures - see the CAMS domain model for an overview. In the following example, a study is created "by hand", i.e. you specify each trigger, task, and measure in the study.

// Import packages used in this example
import 'dart:async';
import 'dart:convert';

import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

void example() async {
  // Create a study using a local file to store data
  Study study = Study("1234", "user@dtu.dk",
      name: "An example study",
      dataEndPoint: FileDataEndPoint()
        ..bufferSize = 500 * 1000
        ..zip = true
        ..encrypt = false);

  // automatically collect accelerometer and gyroscope data
  // but delay the sampling by 10 seconds
  study.addTriggerTask(
      DelayedTrigger(delay: Duration(seconds: 10)),
      AutomaticTask(name: 'Sensor Task')
        ..addMeasure(PeriodicMeasure(
            MeasureType(NameSpace.CARP, SensorSamplingPackage.ACCELEROMETER),
            frequency: const Duration(seconds: 5),
            duration: const Duration(seconds: 1)))
        ..addMeasure(PeriodicMeasure(
            MeasureType(NameSpace.CARP, SensorSamplingPackage.GYROSCOPE),
            frequency: const Duration(seconds: 6),
            duration: const Duration(seconds: 2))));

  // create a light measure variable to be used later
  PeriodicMeasure lightMeasure = PeriodicMeasure(
    MeasureType(NameSpace.CARP, SensorSamplingPackage.LIGHT),
    name: "Ambient Light",
    frequency: const Duration(seconds: 11),
    duration: const Duration(milliseconds: 100),
  );
  // add it to the study to start immediately
  study.addTriggerTask(ImmediateTrigger(),
      AutomaticTask(name: 'Light')..addMeasure(lightMeasure));

  // Create a Study Controller that can manage this study.
  StudyController controller = StudyController(study);

  // await initialization before starting/resuming
  await controller.initialize();
  controller.resume();

  // listening on all data events from the study
  controller.events.forEach(print);

  // listen on only CARP events
  controller.events
      .where((datum) => datum.format.namespace == NameSpace.CARP)
      .forEach(print);

  // listen on LIGHT events only
  controller.events
      .where((datum) => datum.format.name == SensorSamplingPackage.LIGHT)
      .forEach(print);

  // map events to JSON and then print
  controller.events.map((datum) => datum.toJson()).forEach(print);

  // listening on a specific event type
  // this is equivalent to the statement above
  ProbeRegistry().eventsByType(SensorSamplingPackage.LIGHT).forEach(print);

  // subscribe to events
  StreamSubscription<Datum> subscription =
      controller.events.listen((Datum datum) {
    // do something w. the datum, e.g. print the json
    print(JsonEncoder.withIndent(' ').convert(datum));
  });

  // sampling can be paused and resumed
  controller.pause();
  controller.resume();

  // pause specific probe(s)
  ProbeRegistry()
      .lookup(SensorSamplingPackage.ACCELEROMETER)
      .forEach((probe) => probe.pause());

  // adapt measures on the go - calling hasChanged() forces a restart of
  // the probe, which will load the new measure
  lightMeasure
    ..frequency = const Duration(seconds: 12)
    ..duration = const Duration(milliseconds: 500)
    ..hasChanged();

  // disabling a measure will pause the probe
  lightMeasure
    ..enabled = false
    ..hasChanged();

  // once sampling has to stop, e.g. in a Flutter dispose() method, call stop().
  // note that once a sampling has stopped, it cannot be restarted.
  controller.stop();
  subscription.cancel();
} 
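
As noted in the last lines of the example, sampling is typically stopped when the app's sensing UI goes away. Below is a minimal sketch of how this could look in a Flutter StatefulWidget; the widget and its field names are illustrative (not part of CAMS), and the study is set up exactly as in the examples above.

import 'dart:async';

import 'package:flutter/widgets.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

// Illustrative widget showing where to start and stop sensing.
class SensingPage extends StatefulWidget {
  @override
  _SensingPageState createState() => _SensingPageState();
}

class _SensingPageState extends State<SensingPage> {
  StudyController controller;
  StreamSubscription<Datum> subscription;

  @override
  void initState() {
    super.initState();
    _startSensing();
  }

  void _startSensing() async {
    // set up the study as in the examples above
    Study study = Study('1234', 'user@dtu.dk',
        name: 'An example study',
        dataEndPoint: FileDataEndPoint()
          ..bufferSize = 500 * 1000
          ..zip = true
          ..encrypt = false);

    controller = StudyController(study);
    await controller.initialize();
    controller.resume();

    // keep a reference to the subscription so it can be cancelled in dispose()
    subscription = controller.events.listen(print);
  }

  @override
  void dispose() {
    // stop sampling when the widget is removed - note that it cannot be restarted
    controller.stop();
    subscription.cancel();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => Container();
}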

There is a very simple example app which shows how a study can be created with different tasks and measures. This app simply prints the sensing data to a console screen on the phone. There is also a range of different examples of how to create a study to take inspiration from.

However, the CARP Mobile Sensing App provides a MUCH better example of how to use the package in a Flutter BLoC architecture, including good documentation of how to do this.

Features and bugs #

Please read about existing issues and file new feature requests and bug reports at the issue tracker.

License #

This software is copyright (c) Copenhagen Center for Health Technology (CACHET) at the Technical University of Denmark (DTU). This software is available 'as-is' under an MIT license.
