# Model: Weboji
- Powered by Jeeliz's Weboji
- Full technical documentation
- 😀 6DOF head pose estimation
- 😜 11 face morphs and 16 helper states
- 🔌 Comes with "Face Pointer" based plugins
This model uses morphs to help estimate various face states simultaneously and includes assistive tech plugins for browsing pages with face gestures.
# Usage
## With defaults

```js
const handsfree = new Handsfree({weboji: true})
handsfree.start()
```
## With config

```js
// These are all the default values
const handsfree = new Handsfree({
  weboji: {
    // Whether the model is enabled or not
    enabled: false,

    // Custom video settings
    videoSettings: {
      // The video, canvas, or image element
      // Omit this to auto create a <video> with the webcam
      videoElement: null,
      // ID of the device to use
      // Omit this to use the system default
      deviceId: null,
      // Which camera to use on the device
      // Possible values: 'user' (front), 'environment' (back)
      facingMode: 'user',
      // Video dimensions
      idealWidth: 320,
      idealHeight: 240,
      minWidth: 240,
      maxWidth: 1280,
      minHeight: 240,
      maxHeight: 1280
    },

    // Thresholds a morph must reach before it's considered "activated"
    // - Ranges from 0 (not active) to 1 (fully active)
    morphs: {
      threshold: {
        smileRight: 0.7,
        smileLeft: 0.7,
        browLeftDown: 0.8,
        browRightDown: 0.8,
        browLeftUp: 0.8,
        browRightUp: 0.8,
        eyeLeftClosed: 0.4,
        eyeRightClosed: 0.4,
        mouthOpen: 0.3,
        mouthRound: 0.8,
        upperLip: 0.5
      }
    }
  }
})
```
# Data
```js
/**
 * {Boolean} Whether the face is detected or not
 */
handsfree.data.weboji.isDetected
```
```js
/**
 * {Array} Face morphs, from 0 (not activated) to 1 (fully activated)
 *
 * 0: smileRight → closed mouth smile right
 * 1: smileLeft → closed mouth smile left
 * 2: eyeBrowLeftDown → left eyebrow frowned
 * 3: eyeBrowRightDown → right eyebrow frowned
 * 4: eyeBrowLeftUp → raise left eyebrow (surprise)
 * 5: eyeBrowRightUp → raise right eyebrow (surprise)
 * 6: mouthOpen → open mouth
 * 7: mouthRound → o shaped mouth
 * 8: eyeRightClose → close right eye
 * 9: eyeLeftClose → close left eye
 * 10: mouthNasty → nasty mouth (show teeth)
 */
handsfree.data.weboji.morphs
```
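Working with raw indices can be error-prone. As an illustration (this helper is not part of the Handsfree.js API), the array can be remapped to names using the index list above — the sample values here are made up:

```javascript
// Index → name mapping, following the morph list above
const MORPH_NAMES = [
  'smileRight', 'smileLeft',
  'eyeBrowLeftDown', 'eyeBrowRightDown',
  'eyeBrowLeftUp', 'eyeBrowRightUp',
  'mouthOpen', 'mouthRound',
  'eyeRightClose', 'eyeLeftClose',
  'mouthNasty'
]

// Turn the raw morphs array into a {name: value} object
const namedMorphs = (morphsArray) =>
  Object.fromEntries(morphsArray.map((value, i) => [MORPH_NAMES[i], value]))

// Fake sample data; in practice pass handsfree.data.weboji.morphs
const sample = [0.9, 0.85, 0, 0, 0.1, 0.1, 0.05, 0, 0, 0, 0]
const morphs = namedMorphs(sample)
console.log(morphs.smileRight) // 0.9
```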
```js
/**
 * {Array} Head rotation [pitch, yaw, roll]
 * - in radians where [0, 0, 0] is the head pointed directly at camera
 */
handsfree.data.weboji.rotation
```
```js
/**
 * {Array} Head rotation [pitch, yaw, roll]
 * - in degrees where [0, 0, 0] is the head pointed directly at camera
 */
handsfree.data.weboji.degree
```
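The degree array holds the same values as rotation, just converted from radians; a quick sketch of that relationship:

```javascript
// rotation is [pitch, yaw, roll] in radians; degree holds the same values in degrees
const toDegrees = (rotation) => rotation.map(rad => rad * 180 / Math.PI)

// Head turned π/2 radians (a quarter turn) on the yaw axis:
const deg = toDegrees([0, Math.PI / 2, 0]) // ≈ [0, 90, 0]
```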
```js
/**
 * {Array} Head translation [x, y, s]
 * - These are each between 0 and 1
 * - Scale refers to the size of the head in relation to the webcam frame
 */
handsfree.data.weboji.translation
```
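Since x and y are normalized to 0–1, scaling them by the viewport gives pixel coordinates. A minimal sketch (the helper and its axis conventions are illustrative, not part of the API):

```javascript
// Scale normalized head translation into pixel coordinates for a viewport,
// e.g. toPixels(handsfree.data.weboji.translation, window.innerWidth, window.innerHeight)
const toPixels = ([x, y, scale], width, height) => ({
  x: x * width,
  y: y * height,
  scale // head size relative to the webcam frame, passed through unchanged
})

toPixels([0.5, 0.25, 0.4], 1280, 720) // {x: 640, y: 180, scale: 0.4}
```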
```js
/**
 * {Object} Where on the screen the head is pointed at {x, y}
 * - This is updated by: handsfree.plugin.facePointer
 */
handsfree.data.weboji.pointer
```
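A minimal face-pointer sketch: a helper that positions a cursor element from the pointer data, wired up through Handsfree.js's `handsfree.use(name, callback)` plugin API. The `#cursor` element and the plugin name are made up for illustration:

```javascript
// Position an absolutely-positioned element at the face pointer
const positionCursor = (pointer, el) => {
  el.style.left = `${pointer.x}px`
  el.style.top = `${pointer.y}px`
  return el
}

// Hypothetical wiring (run in the browser):
// const cursor = document.querySelector('#cursor')
// handsfree.use('moveCursor', ({weboji}) => {
//   if (weboji.isDetected) positionCursor(weboji.pointer, cursor)
// })
```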
```js
/**
 * {Object} Helper booleans checking if the morph has reached a threshold
 *
 * .smileRight      Smirking lips to the right
 * .smileLeft       Smirking lips to the left
 * .smile           Smiling equally to both sides
 * .smirk           Smiling either to the right or left, but not both
 * .pursed          Kiss face
 *
 * .browLeftUp      Left eyebrow raised up
 * .browRightUp     Right eyebrow raised up
 * .browsUp         Both eyebrows raised up
 * .browLeftDown    Left eyebrow frowning down
 * .browRightDown   Right eyebrow frowning down
 * .browsDown       Both eyebrows frowning down
 * .browseUpDown    One eyebrow down and the other up ("The Rock eyebrows")
 *
 * .eyeLeftClosed   The left eye closed
 * .eyeRightClosed  The right eye closed
 * .eyesClosed      Both eyes closed
 *
 * .mouthClosed     The mouth is closed
 * .mouthOpen       The mouth is open
 */
handsfree.data.weboji.state
```
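These booleans come from comparing each morph against its configured threshold. A simplified sketch of how the smile-related helpers could be derived (the exact internal logic is Handsfree.js's; this is only an illustration):

```javascript
// Derive smile-related helper states from named morph values
// and the thresholds from the config above
const smileStates = (morphs, threshold) => {
  const smileRight = morphs.smileRight >= threshold.smileRight
  const smileLeft = morphs.smileLeft >= threshold.smileLeft
  return {
    smileRight,
    smileLeft,
    // Smiling equally to both sides
    smile: smileRight && smileLeft,
    // Smiling to one side but not both
    smirk: (smileRight || smileLeft) && !(smileRight && smileLeft)
  }
}

const state = smileStates(
  {smileRight: 0.9, smileLeft: 0.2},
  {smileRight: 0.7, smileLeft: 0.7}
)
// → {smileRight: true, smileLeft: false, smile: false, smirk: true}
```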
# API
Please see the Weboji docs for all of the methods exposed through `handsfree.model.weboji.api`:

```js
// Check if the head is detected or not
handsfree.model.weboji.api.is_detected()
```
# Using a pre-recorded video instead of a webcam
By default, setting `{weboji: true}` adds a new `<video>` element to the DOM to grab the webcam:

```js
const handsfree = new Handsfree({weboji: true})
```

To use a pre-recorded video, a video stream, a canvas, or an image instead of a webcam, set the `.videoSettings.videoElement` property:

```js
const handsfree = new Handsfree({
  weboji: {
    enabled: true,
    videoSettings: {
      videoElement: document.querySelector('#my-video')
    }
  }
})
```
# Examples
> The Handsfree.js repo can itself be loaded as an unpacked Chrome Extension: https://t.co/8RFl3yR0uA
>
> So if you'd like to go that route, all the heavy work is already done for you. Additionally, with WebSockets and Robot.js, you can control your desktop too! pic.twitter.com/m7Xunc0pfq
>
> — Oz Ramos (@MIDIBlocks) February 5, 2021

> This newer rewrite does less out of the box but will be way more extensible.
>
> You can use it with Robot.js or other desktop automation libraries to control your desktop/devices. Here's an older demo of that (will share code to this soon) pic.twitter.com/ShoAwHGGHu
>
> — Oz Ramos (@MIDIBlocks) November 12, 2020

> Here's a 30sec video w/ positioning & smoothing.
>
> On the right is my Chrome Dev Tools opened to the #WebXR tab that comes with the Mozilla Emulator Extension with the new Handsfree button 🖐👀🖐
>
> Thanks to @i0nif for the enthusiastic idea & vision! Repo + docs + more after holidays pic.twitter.com/rdV9MIjUBk
>
> — Oz Ramos (@MIDIBlocks) December 25, 2020

> Working on a boilerplate for "looking around" an A-Frame scene handsfree without a VR headset.
>
> Going to release this tonight along with a tutorial! Since Handsfree.js is built in a way to support Hot Reload, one idea is to help you work on your 3D projects and look around while coding! pic.twitter.com/NlIMKxgqWT
>
> — Oz Ramos (@MIDIBlocks) December 21, 2020

> Handsfree projector w/ funky angles, test #1 🙌
>
> Goal is to see what happens when the surface you want to point at is different than where the webcam is.
>
> This is the first step in my implementation of this paper but with projection mapping instead of AR: https://t.co/bflVqmW2RJ pic.twitter.com/YgmYGRETB3
>
> — Oz Ramos (@MIDIBlocks) December 4, 2020
# See also
- Examples
- A-Frame
- Plugins