
Face Tracking & VBridger


The most important step in vtubing is configuring your face tracking, not only to suit your features and expressions, but also to make piloting more effortless. A vtuber model is like a handmade suit: it can be beautiful and impressively crafted, but if it isn't tailored to fit you, then all that detail you just paid for is wasted!




Camera vs Parameters

There are two sections you can edit in VTube Studio: the tracking sensitivity of your camera, and the sensitivity of individual parameters. In this guide we're going to adjust the parameters first. Then, once your model is fine-tuned for your face, you can choose to adjust the camera settings if you want to make pulling certain expressions easier on your muscles.


Understanding Parameter Settings

Your parameters can be found in the Model tab of the Settings menu. Each parameter is labelled in a box with the following fields:


  • Input: The IRL movement that is being tracked.

  • Output: The model's movement that has been rigged.

  • Smoothing: Limits the movement to create a smoother transition.

  • Auto-Blinking: Overrides tracking to trigger the parameter on a frequent, random basis.

  • Auto-Breathing: Overrides tracking and animates the parameter at a constant pace.

  • IN: The sensitivity of the input being tracked.

  • OUT: The range of motion for the output.
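VTube Studio doesn't publish its internals, but the effect of the Smoothing setting can be pictured as a simple moving average: the higher the smoothing, the more each frame clings to the previous value instead of jumping to the new camera sample. A rough Python sketch (the function name and formula are illustrative, not the app's actual code):

```python
def smooth(prev_value, new_sample, smoothing):
    """Blend the newest tracking sample toward last frame's value.

    smoothing = 0.0  -> output follows the camera exactly
    smoothing near 1 -> output barely moves, jitter is hidden
    """
    return smoothing * prev_value + (1.0 - smoothing) * new_sample

# No smoothing: a sudden blink registers instantly.
smooth(0.0, 1.0, 0.0)   # -> 1.0
# Heavy smoothing: the same blink only moves the value ~20% this frame.
smooth(0.0, 1.0, 0.8)   # -> roughly 0.2
```

The trade-off is latency: heavy smoothing hides camera jitter, but fast movements like blinks will lag behind your face.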


To better understand what all these numbers mean, think of a movement like nodding your head up and down: AngleY. Looking straight ahead is the baseline of 0. The maximum range would be lifting your head as far as it can go, which translates to +1. That makes looking down the absolute minimum, -1.


A blink, on the other hand, isn't a symmetrical movement like moving your head; your eye is either open or it's closed. In that case, EyeOpen has a maximum of 1 when the eye is fully open and a minimum of 0 when it's closed.
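Put another way, every parameter is just a number clamped into a fixed range: symmetrical movements like AngleY run from -1 to +1 around a neutral 0, while one-sided movements like EyeOpen run from 0 to 1. A tiny sketch of that idea (illustrative only, not VTube Studio's actual code):

```python
def clamp(value, lo, hi):
    """Keep a tracked value inside the parameter's legal range."""
    return max(lo, min(hi, value))

# AngleY is symmetrical: neutral 0, extremes at -1 and +1.
clamp(1.4, -1.0, 1.0)    # -> 1.0: lifting past the max still reads as +1
# EyeOpen is one-sided: 0 is closed, 1 is fully open.
clamp(-0.2, 0.0, 1.0)    # -> 0.0: tracking noise below zero stays "closed"
```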




How To Tune Your Tracking

Fine-tuning your face tracking is fiddly but relatively simple: for each parameter you want to edit, you'll pull that face IRL to see how close you can get to the maximum IN number. If you fall a little short, you'll adjust the maximum to match whatever number you hit instead.


You can also adjust the number range however you want, like increasing the sensitivity so you don't have to move your head as far for the model to hit its maximum ranges. This can be a great way to make piloting less fatiguing, or to make your model seem more expressive if you're a naturally subdued person. The reverse works too: if you tend to smile a lot but want to present a character who is cold and broody, you can adjust your numbers so that even your widest smile barely registers on the model.
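Under the hood, the IN and OUT fields describe a linear remap: wherever your tracked value falls inside the IN range is translated to the same relative position inside the OUT range, so shrinking the IN range raises the sensitivity. A sketch of that remap (hypothetical function, not the app's actual code):

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Map a tracked value from the IN range onto the OUT range."""
    t = (value - in_min) / (in_max - in_min)   # position inside IN, 0..1
    t = max(0.0, min(1.0, t))                  # never overshoot the rig
    return out_min + t * (out_max - out_min)

# Default ranges: a half-strength smile gives a half-smiling model.
remap(0.5, 0.0, 1.0, 0.0, 1.0)   # -> 0.5
# IN max lowered to 0.5: the same subtle smile now maxes the model out.
remap(0.5, 0.0, 0.5, 0.0, 1.0)   # -> 1.0
```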


Example: EyeSquint

(This particular parameter is a feature of VBridger, so we'll need an iOS camera and the VBridger plugin installed!)

  1. Make sure your iOS camera is connected, and VBridger is running.

  2. In VTube Studio, open the Model tab of the Settings menu and scroll down to the parameter you want to edit.

    • You can quickly find them by hovering over the folder icon and selecting a category (eyes, mouth etc), or typing something (eye, mouth etc) in the search bar.

  3. This parameter is triggered when we squint our eyes, like when you're concentrating or smiling super wide. Check the IN readout when you're squinting as much as is comfortable.



The IN Numbers:

The top field is the maximum range of motion, the bottom field is the minimum, and the number on the left is what the camera is currently tracking.

  1. When you're squinting as hard as you can, the left number is your personal maximum.

  2. Copy that into the top text field, and it becomes the new maximum for this expression. Now the camera knows this is the most you can squint, so the model should squint at 100% whenever you pull this face.


In this example, even though I was squinting as much as possible, I could only get up to 0.80 on the readout. You can see how this affects the model by looking at the OUT section: The model is outputting 0.79, or 79% of the expression that has been rigged. It thinks I’m only partially squinting, so it’s making the model partially squint too, and we're wasting 20% of potential expressiveness by leaving this as is!


Now, after changing the IN maximum to 0.80 to match my face, the model is outputting 1.00, or 100% of the expression.
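The arithmetic behind those numbers is simple: the raw value is scaled against the configured IN maximum. With the default maximum of 1.0, a raw squint of 0.80 outputs roughly 0.80; lower the maximum to 0.80 and the same squint reaches a full 1.00. A self-contained sketch (hypothetical names, illustrative only):

```python
def squint_output(raw, in_max):
    """Scale the raw tracking value against the configured IN maximum."""
    return min(raw / in_max, 1.0)

squint_output(0.80, 1.00)   # -> 0.8: model only squints 80% of the way
squint_output(0.80, 0.80)   # -> 1.0: full squint after retuning
```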


Adjusting Minimum Inputs

This works the same for the bottom number, which is important for making sure your neutral resting face isn’t triggering any unwanted expressions. 


Here, I’m pushing MouthFunnel to the limit to make sure the model doesn’t funnel when I’m not intending it to. MouthFunnel occurs with "sh" and "ch" sounds, as well as words like "you", where your lips open but the corners of your mouth stay shut.


In this test, I’m pursing my lips in a kissing shape, but making sure my mouth stays completely shut. The camera is reading my lip shape as an open funnel, making the VTuber open its mouth and show teeth even though my mouth is closed IRL.


I use the same process as before, checking the number on the left and inputting it in the minimum IN field. Now the camera knows this is the new baseline, and the model's mouth stays shut when it’s supposed to.
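Raising the IN minimum works the same way from the other end: anything at or below the new minimum is treated as zero movement, so a resting face that falsely reads, say, 0.35 no longer leaks into the model. A self-contained sketch (the values and function name are hypothetical):

```python
def funnel_output(raw, in_min, in_max=1.0):
    """Readings at or below the IN minimum count as no movement at all."""
    t = (raw - in_min) / (in_max - in_min)
    return max(0.0, min(1.0, t))

funnel_output(0.35, 0.00)   # -> 0.35: resting face wrongly funnels
funnel_output(0.35, 0.35)   # -> 0.0: same face, mouth stays shut
```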



Cheatsheet: Parameter Expression Examples

Here's an easy list of the parameters you should prioritise testing and tweaking, as well as the faces you'll need to pull to trigger them. VBridger-only expressions will be marked with a 🟣. Please forgive my close ups!!


Mouth Smile

With your mouth closed, smile (maximum) and draw the corners of your mouth downwards (minimum).



Mouth X (iOS only)

Purse your lips from side to side.



Mouth Open

This reads your lips. Make sure the minimum number sits no higher than 0 when your lips are shut, and the maximum hits 1 when your mouth is as wide open as you can get it.



🟣 Jaw Open

This reads your jawline. Make sure the teeth stay together when grinning (minimum), and that the maximum hits 1 when your mouth is open as wide as you can get it. Also check opening your jaw as wide as you can when your lips are still together (like when chewing).



🟣 Brow Inner Up

Pinch the middle of your brows up without raising them.



🟣 Mouth Funnel

With your teeth together, open your mouth in a “ch” sound, and double check the expression by opening your mouth wide in a pog shape.



🟣 Mouth Shrug

Push your mouth up in a pout.



🟣 Mouth Pucker Widen

Pucker your lips for a kiss (minimum), then widen them in a big smile (maximum).



🟣 Mouth Press Lip Open

Roll your lips inwards (minimum), and bare your teeth with your lips open as much as possible (maximum).




Creating Expression Hotkeys

As a final note, it's good to remember that you can make expressions out of any parameter combination you'd like. Some faces and movements are physically demanding, and can either be jittery no matter how much you change the sensitivity, or just plain exhausting after a while. It can be a good idea to set some of these to hotkeys, so you can toggle them without having to physically make the face.


This section assumes you've read the Quick Start guide on setting up hotkeys.

  1. Open the Hotkeys tab in the Settings menu.

  2. Select the Expression Editor. From here you can edit or create new expressions.

  3. With a new or existing expression open, you’ll be able to see the full range of parameters that I’ve rigged for this model. Have fun making expressions by enabling and disabling different options! Remember that anything ticked in this will override your facial tracking when the toggle is active, even if the value is set to 0.

  4. Assign your new expression to a hotkey to be able to toggle it: Scroll down the list until you see the PLUS + icon to create a new hotkey.

  5. Select Hotkey Actions and choose Set/Unset Expression from the list.

  6. Select Expressions and choose your desired expression from the list.

  7. Optional: Record a new Key Combination, or create a new Stream Deck button to trigger this hotkey.

  8. Optional: Adjust how quickly this expression activates by changing the Fade for Sec. number. Speeding it up to 0.2-0.3 can feel much snappier and hide layer opacity issues, but slowing it to 0.5-0.7 can make for a nice smooth transition for body movements and poses.
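The fade can be pictured as a linear blend from your live tracking toward the toggled expression over that many seconds: at 0.2 s the override snaps in, at 0.7 s it eases in. A rough sketch of the idea (illustrative, not VTube Studio's actual code):

```python
def faded_value(tracked, expression, elapsed, fade_seconds):
    """Blend from live tracking toward the hotkey expression over time."""
    if fade_seconds <= 0:
        return expression                      # instant toggle
    t = min(elapsed / fade_seconds, 1.0)       # fade progress, 0..1
    return tracked + t * (expression - tracked)

# Halfway through a 0.5 s fade, the parameter is halfway to the expression.
faded_value(0.0, 1.0, 0.25, 0.5)   # -> 0.5
```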




© Bilvy J Lee. All artwork on this site is my own unless specifically noted.
