Motion Analyzer Design

In 2016 I started the conceptual design of a motion analyzer for Apple platforms: macOS, iOS, and watchOS.

The motion analyzed is that of the Apple Watch during the events of shooting a firearm.

In October 2017 I acquired the necessary hardware and began a preliminary design.
In December 2017, prototyping began.
In January 2018, the first prototype was taken into the field to collect data.

The Motion Analyzer will use the Apple Watch's haptic feedback.

The Motion Analyzer has the following features:

  • Record position data through the use of the gyroscope, the accelerometers, and other sensors, including the magnetometer and altitude data (see the recording sketch after this list).
  • Report the following:
    • Time it takes to raise (motion) the firearm into firing position.
    • Time it takes to recover (motion) / get back on target between shots.
    • Angles experienced during the motion caused by the shot.
      • How much the shooter's arm, etc., rises (motion) or moves out of line (motion) with the target.
      • How much the shooter overcorrects (motion) preparing for the next shot.
        • How much time the overcorrection takes.
      • Any movement to the left or right, instead of pure up and down.
    • Time between shots.
    • Time to go back into the resting position (motion).
    • Three-dimensional positioning of the firearm during the recorded data.
    • Felt recoil of the firearm, reported as the G-force experienced during firing.
    • Play back of the data in various ways:
      • 3D visualization of a simulated shooters arm and the movement.
      • Simulated laser and shot indication.
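
As a sketch of what the recording step could look like, here is a minimal CoreMotion recorder for watchOS. The MotionRecorder type, the sample structure, and the 100 Hz rate are illustrative assumptions, not the prototype's actual code:

```swift
import CoreMotion

// One sample per device-motion update.
struct MotionSample {
    let timestamp: TimeInterval
    let rotationRate: CMRotationRate      // gyroscope, rad/s
    let userAcceleration: CMAcceleration  // accelerometer with gravity removed, in g
    let attitude: CMAttitude              // 3D orientation
}

final class MotionRecorder {
    private let manager = CMMotionManager()
    private(set) var samples: [MotionSample] = []

    func start() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 100.0 // 100 Hz, hardware permitting
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let motion = motion else { return }
            self?.samples.append(MotionSample(timestamp: motion.timestamp,
                                              rotationRate: motion.rotationRate,
                                              userAcceleration: motion.userAcceleration,
                                              attitude: motion.attitude))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```

The magnetometer and altimeter would be read through their own CoreMotion interfaces and merged by timestamp.
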
The reporting will be done with data such as numeric values, 2D graphs, 3D graphs, etc.

For instance, one report will show a simulated laser as if it were coming out of the barrel: the movement of the firearm appears as a moving dot, along with an indication of each shot fired.
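
One way to drive such a dot, assuming the watch attitude approximates the barrel direction and has been re-referenced to the on-target frame (CMAttitude's multiply(byInverseOf:) can do the re-referencing): project pitch and yaw onto a target plane at a known distance. The function and its distance parameter are illustrative:

```swift
import CoreMotion

/// Position of the simulated laser dot, in meters, on a target plane
/// `d` meters away. Pitch and yaw are taken relative to the on-target frame.
func laserDot(attitude: CMAttitude, distanceToTarget d: Double) -> (x: Double, y: Double) {
    let x = d * tan(attitude.yaw)   // left/right deviation
    let y = d * tan(attitude.pitch) // up/down deviation
    return (x, y)
}
```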

This is not a shot timer. The timing of the complete shooting activity is included in the data, but the data is not just one-dimensional (time between shots).

Below are screenshots from the working prototype. Data was collected from firing 100 rounds through an S&W M&P 9 Shield and another 100 rounds through a Glock 43.


[Screenshot: Data sets sent from Apple Watch to iPhone]

[Screenshot: Accelerometer data over time]

[Screenshot: Gyroscope data over time]

This blog captures the time frame of these inventions. This design was done without reference to any existing system; it is completely from scratch. While prior art may exist, I am not aware of any, and none has influenced the design of this system.

The design of the system uses the Apple Watch, which is attached to the person, not the firearm; it is therefore a personal information recording device, recording the motion of the Apple Watch.

The system uses gyroscope data to detect changes in arm position that correlate with recoil caused by the operation of a firearm. It uses accelerometer data to measure the force involved, which can serve as a value representing "felt recoil," a value unique to the person rather than a constant produced by the firearm-and-ammunition system.
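
A sketch of how a shot could be picked out of the recording and scored, reusing the MotionSample type from the earlier sketch: look for a spike in acceleration magnitude and report the peak G-force as the felt-recoil value. The 3 g trigger threshold is an illustrative assumption, not a calibrated figure:

```swift
import CoreMotion

/// Peak G-force of a detected shot, or nil if no sample crosses the trigger.
func detectShot(in samples: [MotionSample], triggerG: Double = 3.0) -> Double? {
    var peak: Double?
    for s in samples {
        let a = s.userAcceleration
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z) // in g
        if magnitude >= triggerG {
            peak = max(peak ?? 0, magnitude) // "felt recoil" at the wrist
        }
    }
    return peak
}
```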

The gyroscope data can be interpreted to show the shooter's arm at their side (at rest), when and how quickly the arm is raised, when the arm levels onto the target, the firing of the firearm, and the behavior of the user (the personal information) during recoil and recovery. This interpretation draws on data from the gyroscope, the accelerometers, the position relative to the pull of gravity, the altimeter, the magnetometer, GPS, and any future data-acquisition abilities added to the Apple Watch that can be used in the analysis of position, forces, and motion.
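
To make that concrete, the phases could be segmented with a simple classifier over pitch and rotation-rate magnitude, as sketched below; every threshold is a placeholder for illustration, not a calibrated value:

```swift
enum ShootingPhase {
    case atRest, raising, onTarget, recoil, recovering
}

/// Classifies one sample from pitch (radians) and gyro magnitude (rad/s).
/// All thresholds are illustrative placeholders.
func classify(pitch: Double, rotationMagnitude: Double,
              previous: ShootingPhase) -> ShootingPhase {
    let armDown = pitch < -1.2            // hanging at the side
    let armLevel = abs(pitch) < 0.15      // roughly leveled on target
    let moving = rotationMagnitude > 1.0  // significant angular motion

    switch (armDown, armLevel, moving) {
    case (true, _, _):     return .atRest
    case (_, true, false): return .onTarget
    case (_, true, true):  return previous == .onTarget ? .recoil : .raising
    default:               return moving ? .raising : .recovering
    }
}
```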

Analysis of the data will be done through numerical-analysis and signal-processing formulas and algorithms, machine learning, and other techniques.
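
For example, a basic exponential low-pass filter separates the slow arm motion from the sharp recoil transient; subtracting its output from the raw signal leaves the high-frequency component where shots stand out. The smoothing factor below is an illustrative choice:

```swift
/// Exponential moving-average low-pass filter.
/// alpha near 0 smooths heavily; alpha near 1 passes the signal through.
func lowPass(_ signal: [Double], alpha: Double = 0.1) -> [Double] {
    guard var current = signal.first else { return [] }
    var smoothed: [Double] = []
    smoothed.reserveCapacity(signal.count)
    for sample in signal {
        current = alpha * sample + (1 - alpha) * current
        smoothed.append(current)
    }
    return smoothed
}
```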

The machine learning will include data such as but not limited to:

  • Make, Model, and physical characteristics of the firearm
  • Make and details of the ammunition, including the velocity, bullet, powder, primer, and all of the physical characteristics of ammunition, such as weights and lengths.
  • Accessories and modifications such as custom grips, finger rests for magazines, etc.
  • Firing grip of the shooter, such as cup and saucer, thumbs forward, etc.
  • Firing stance of the shooter such as isosceles, weaver, rested, prone, etc.
The machine-learned data will be used to improve motion-data analysis, including for firearm and ammunition combinations that do not produce significant forces, movements, or accelerations, and to give meaningful feedback such as "your recovery time between shots is 0.3 seconds faster using the thumbs-forward grip with the weaver stance..."
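
The context that would label each recorded session for training might be captured in a record like the one below; every field name is hypothetical and simply mirrors the list above:

```swift
/// Hypothetical metadata attached to each recorded session as training labels.
struct SessionContext: Codable {
    let firearmMake: String      // e.g. "S&W"
    let firearmModel: String     // e.g. "M&P 9 Shield"
    let ammunition: String       // bullet, powder, primer, weights, lengths, velocity
    let modifications: [String]  // custom grips, magazine finger rests, ...
    let grip: String             // "thumbs forward", "cup and saucer", ...
    let stance: String           // "isosceles", "weaver", "rested", "prone", ...
}
```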


Activation of the application can be done with motions such as raising your arm, changing your stance, or tapping or moving the device (any of the ways the Apple Watch provides); by interacting with the UI; by scheduling via the Apple Watch timing and alarm capabilities; or via the same or additional interactions with the companion iOS application, as is common practice with watchOS.

Fall 2016 - Concept presented
October 2017 - Hardware acquired
December 2017 - Prototype Completed
January 2018 - Field Testing


The prototype interface on the Apple Watch shows real-time data as it is acquired. The data is sent to the iPhone counterpart application.
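
A typical way to do this on watchOS is WatchConnectivity. The sketch below batches gyroscope values into a user-info transfer, which queues even when the phone is unreachable; the payload keys are assumptions, and a real app would first activate the session with a delegate:

```swift
import WatchConnectivity

/// Queues a batch of recorded gyroscope values for delivery to the iPhone.
/// Assumes WCSession has already been activated with a delegate.
func send(samples: [MotionSample]) {
    guard WCSession.isSupported() else { return }
    let payload: [String: Any] = [
        "timestamps": samples.map { $0.timestamp },
        "gyroX": samples.map { $0.rotationRate.x },
        "gyroY": samples.map { $0.rotationRate.y },
        "gyroZ": samples.map { $0.rotationRate.z }
    ]
    WCSession.default.transferUserInfo(payload) // queued; survives unreachability
}
```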


The iPhone application will show visualizations of the acquired data, as well as numeric values and other feedback.

These prototype screenshots do not show the final designs.



Here are some of the designs under current software development.

Motion of the watch is captured, gimbal lock issues are resolved, etc.
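
Gimbal lock is the usual hazard of working in Euler angles; reading the attitude as a quaternion sidesteps it, and CoreMotion exposes one directly. As an illustration, the helper below computes the rotation angle between two attitudes from their quaternions, which stays well-defined even where Euler angles degenerate:

```swift
import CoreMotion

/// Rotation angle (radians) between two attitudes, via quaternions.
func angleBetween(_ a: CMAttitude, _ b: CMAttitude) -> Double {
    let q1 = a.quaternion, q2 = b.quaternion
    // For unit quaternions, |dot| = cos(theta / 2) of the relative rotation.
    let dot = abs(q1.x * q2.x + q1.y * q2.y + q1.z * q2.z + q1.w * q2.w)
    return 2 * acos(min(1.0, dot))
}
```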







