Synchronizing input events to the screen refresh

The issue

This discussion is a follow-up to the following issues:

The problem is that input events such as a touchpad ‘2-finger scroll’ gesture are not synchronized to the screen refresh (‘vblank’) events. That results in jittery animation. Please refer to the GTK bug report above for example videos and a detailed description.

As an example, my external Apple ‘trackpad’ V1 generates around 90 scroll events per second. That means that, on average, on a 60FPS display we get 1 scroll event on the first frame, 2 events on the second frame, 1 on the third, 2 on the fourth, etc., and sometimes we get 2 events twice in a row. If, say, each event results in a 4px movement, we’ll get 4px on the first frame, 8px on the second, 4px on the third, 8px on the fourth, etc. This causes a visible jitter.

Example scenarios

Scenario 1

An external touchpad which sends 90 events per second is connected to a computer with a 60Hz display. Here we get 1 scroll event on the first frame, 2 events on the second frame, 1 on the third, 2 on the fourth, etc., and sometimes we get 2 events twice in a row.

Scenario 2

An internal touchpad with a 125Hz poll rate is connected to a computer with a 240Hz display. In this case an input event is received only every other frame, and once in a while events will arrive in 2 consecutive frames.

Scenario 3

A standard USB mouse is connected to a monitor with a variable refresh rate, a.k.a. ‘adaptive sync’, ‘free sync’, etc. Here we can get a varying number of input events per frame.

Scenario 4

A standard 125Hz USB mouse is connected to a standard monitor with a 60Hz refresh rate. There will be a jank frame about 5 times per second. That’s because we’ll get on average 2 input events per display frame, but each second there are 5 ‘extra’ input events (125 mod 60 = 5).
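
To make the arithmetic in these scenarios concrete, the pattern can be reproduced with a few lines of plain C that simulate a 125Hz event stream against a 60Hz frame clock and count the events landing in each frame (a standalone illustration, not GTK code):

#include <stdio.h>

/* Simulate a 125Hz input device against a 60Hz display and count how
 * many events land in each frame interval. */
int
main (void)
{
  const double event_interval = 1000.0 / 125.0; /* 8 ms between events */
  const double frame_interval = 1000.0 / 60.0;  /* ~16.67 ms per frame */
  double next_event = 0.0;
  int frame;

  for (frame = 0; frame < 60; frame++) /* one second worth of frames */
    {
      double frame_end = (frame + 1) * frame_interval;
      int events_in_frame = 0;

      while (next_event < frame_end)
        {
          events_in_frame++;
          next_event += event_interval;
        }

      printf ("frame %2d: %d events\n", frame, events_in_frame);
    }

  return 0;
}

The output is mostly 2 events per frame, with a third event landing roughly every 12th frame, which accounts for the 5 ‘extra’ events per second described above.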

The proposed solution

The input events should be resampled to the screen vblank times so that only a single, interpolated input event is processed in each frame. Absolute input events can be directly resampled whereas relative events should be converted to absolute events first, for example by calculating the cumulative sum of the deltas.
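
As a rough illustration of the relative-to-absolute conversion, here is a minimal sketch (hypothetical names and struct layout, using GLib types like the excerpt below, not the actual GTK code) that accumulates each scroll delta into an absolute position together with its event timestamp, ready to be resampled at vblank times:

#include <glib.h>

/* Each incoming relative scroll event is stored as an absolute
 * position (the cumulative sum of all deltas since the gesture began)
 * together with its event timestamp. */
typedef struct {
  guint32 evtime;       /* input event time */
  gdouble abs_x, abs_y; /* cumulative sum of deltas since the gesture began */
} AbsSample;

static void
append_sample (GArray  *history, /* GArray of AbsSample */
               guint32  evtime,
               gdouble  dx,
               gdouble  dy)
{
  AbsSample sample = { evtime, dx, dy };

  if (history->len > 0)
    {
      AbsSample *last = &g_array_index (history, AbsSample, history->len - 1);

      sample.abs_x += last->abs_x;
      sample.abs_y += last->abs_y;
    }

  g_array_append_val (history, sample);
}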

Show me the code…

Following is an excerpt from a POC implementing that solution for scroll gestures in GTK (the complete code can be found in https://gitlab.gnome.org/yarivb/gtk under the branch yariv-test-scroll):

/* two consecutive scroll-history points that bracket the interpolation time */
ScrollHistoryElem *first_elem;
ScrollHistoryElem *second_elem;
gdouble ratio;
gdouble interpolated_x, interpolated_y;

first_elem = &g_array_index (scroll->scroll_history, ScrollHistoryElem, i);
second_elem = &g_array_index (scroll->scroll_history, ScrollHistoryElem, i + 1);

/* how far the interpolation point lies between the two event times, in [0, 1] */
ratio = (gdouble) (interpolation_point - first_elem->evtime) /
        (second_elem->evtime - first_elem->evtime);

/* linear interpolation of the absolute (accumulated) scroll offsets */
interpolated_x = (ratio * second_elem->x) + ((1.0 - ratio) * first_elem->x);
interpolated_y = (ratio * second_elem->y) + ((1.0 - ratio) * first_elem->y);

/* synthesize the interpolated scroll event as a delta from the offsets
 * already delivered */
interpolated_item->dx = interpolated_x - scroll->total_x_offset;
interpolated_item->dy = interpolated_y - scroll->total_y_offset;

Here interpolation_point is the frame time. Under GTK that is obtained from gdk_frame_clock_get_frame_time(). The objects first_elem and second_elem hold 2 consecutive scroll history events. first_elem->{x,y} and second_elem->{x,y} are total offsets (sum of scroll deltas) from when the scroll gesture started.

We assume that the interpolation point lies between the 2 events, i.e. first_elem->evtime < interpolation_point < second_elem->evtime. This excerpt implements a simple linear interpolation, though of course other interpolation methods could be used.
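
For completeness, the two history elements that bracket the interpolation point can be found with a simple scan of the history. This is a sketch under the same assumptions, not the code from the branch; the ScrollHistoryElem layout is inferred from the fields the excerpt uses:

#include <glib.h>

/* Assumed layout of the POC's history element, inferred from the
 * fields used in the excerpt above (evtime, x, y). */
typedef struct {
  guint32 evtime;
  gdouble x, y;
} ScrollHistoryElem;

/* Walk the history (sorted by evtime) and find the pair of consecutive
 * elements whose timestamps bracket interpolation_point. Returns FALSE
 * if the point lies outside the recorded history, in which case the
 * caller can fall back to the raw, non-interpolated event. */
static gboolean
find_bracketing_pair (GArray  *history, /* GArray of ScrollHistoryElem */
                      guint32  interpolation_point,
                      guint   *out_index)
{
  guint i;

  for (i = 0; i + 1 < history->len; i++)
    {
      ScrollHistoryElem *a = &g_array_index (history, ScrollHistoryElem, i);
      ScrollHistoryElem *b = &g_array_index (history, ScrollHistoryElem, i + 1);

      if (a->evtime <= interpolation_point && interpolation_point < b->evtime)
        {
          *out_index = i;
          return TRUE;
        }
    }

  return FALSE;
}

One detail glossed over here is that the timestamps must be on the same clock: gdk_frame_clock_get_frame_time() returns microseconds while GDK event times are in milliseconds (and on a different clock), so the two presumably need to be reconciled before interpolating.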


Where should the interpolation be done?

Possible places - pros and cons

We could implement the interpolation only in GTK, but in that case only GTK-based applications would get the feature. If we only implement it in GNOME Shell, other Mutter-based environments will lack it. Likewise, if we implement it in Mutter, then other compositors will have to implement it themselves. If it is implemented in libinput, then other environments such as wlroots-based Wayland compositors, as well as X11, would get it ‘for free’. However, with regard to libinput, I believe it has no notion of display frames, and in fact that it shouldn’t have such a notion.

Applications also have a say

Not all applications will require this interpolation. Games, for example, prefer to receive raw events and to get them as soon as possible. Drawing applications might want interpolation applied to certain widgets, such as the toolbox, layers, etc. However, these applications will not want interpolation applied to the drawing area, since getting as many input events as possible helps draw smoother curves.

So if we implement the interpolation in libinput (assuming it’s possible) or in Mutter, we either have to somehow communicate to Mutter when not to apply interpolation, or just always send raw events to the clients and have them handle it themselves, which again brings us back to implementing it in the toolkits.

X11/Wayland

Another point to consider is that, IIUC, under X11 input events are directly delivered to the clients. So for X11 we’ll need to either implement it in the X server, or have separate solutions for the toolkit (e.g. GTK) and for the window-manager/compositors. Here again the problem of some clients not wanting it, or requiring it only for certain widgets, arises.

Under Wayland a compositor-only solution might be fine, but then we’ll have to somehow tell the compositor, for each client application, which widgets should get raw events and which should receive interpolated ones… This is one of the reasons I believe the widget toolkits should implement this regardless of any compositor-side implementation.


On second thought, I was wrong about libinput. An implementation in libinput could at least be shared between the X server and the various Wayland compositors.

The problem of client applications remains, though, since for some applications low latency and/or high input event resolution are preferred.

Yes. And if you handle it in GDK, like in that issue, GTK clients would still always get throttled events.

Just one more data point: I tested a 2nd generation Apple external ‘trackpad’, which supports both USB and Bluetooth connectivity. On both the event rate is about 88Hz - just like the (Bluetooth-only) 1st generation.

That means that currently all Apple external trackpad users suffer from this issue.

I’m really not sure what problem is being solved here. Input runs at input frequency and display runs at display frequency. GTK accounts for that by having separate input and redraw events, so what issue is this solving? That a variable refresh rate monitor might be so fast that it outpaces the input?

Typically, programs which ‘fill in the gaps’ of some input frequency do so by interpolation. How they interpolate is often very significant to the function of the program. So maybe this could be a call to a function, something like get_interpolated_input_at_time. That would give the developer a chance to use one of the supported interpolation methods if they want it, rather than having one forced by GTK.
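
To make that suggestion concrete, such an opt-in call could look roughly like the following (a purely hypothetical signature and enum for illustration, not an existing GTK/GDK API):

#include <glib.h>

/* Hypothetical opt-in accessor: a widget asks for the input state
 * resampled at a given time (e.g. the frame time from
 * gdk_frame_clock_get_frame_time()) and chooses the interpolation
 * method instead of having one forced on it. Returns FALSE if no
 * input history is available. */
typedef enum {
  INPUT_INTERPOLATION_NONE,   /* latest raw event, no resampling */
  INPUT_INTERPOLATION_LINEAR, /* linear, as in the excerpt above */
} InputInterpolationMethod;

gboolean get_interpolated_input_at_time (gint64                    frame_time,
                                         InputInterpolationMethod  method,
                                         gdouble                  *x,
                                         gdouble                  *y);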

Input runs at input frequency and display runs at display frequency

That’s exactly the problem.

Supporting interpolation as a solution to this problem is basically the idea, only here we are talking about GTK itself, along with the rest of the UI stack (Mutter, GNOME Shell, etc.). Right now GTK doesn’t do any interpolation for its own widgets. That results in jittery animation on certain setups.

Please take a look at the videos attached to the original issues, e.g. gtk#2025. I think they do a good job of demonstrating the issue. You’ll have to download them and view them locally with mpv to see the difference. The jittery one was compiled from GTK master at the time, and the smooth one has a patch implementing interpolation. The same bug report also has some example data which should make the problem clear. If it isn’t clear, please let me know and I’ll try to improve it.

Basically any gtk-based app is jittery on my machine whenever continuous touchpad input is involved. In fact, since gnome-shell/mutter don’t do interpolation either, even simply moving the mouse or dragging a window is jittery.

I watched the videos and I do see the problem now. My suggestion would be velocity-based/kinetic scrolling rather than strict interpolation. The scrolling in those videos does look mechanical, but a lot of that is because it strictly corresponds to the input. For example, there is a clunky halt when the user lifts their finger. However, I see this is already discussed in the issue thread.

Apart from ‘throwing’ the scroll, I would also suggest spring-loaded edges: if the user tries to scroll the item beyond its bounds, it moves a little but resists the motion as they push further, and springs back to the boundary of the scrolled item when they let go. You can see this behavior with almost any list on a touchscreen phone.

However, some people do seem to like that very immediate response.

Kinetic scrolling is actually already supported in the existing code, well before this POC. I think that when recording the session I halted my fingers before lifting them, in order to minimize the kinetic scroll effect. This was done to prevent confusion about the issue: otherwise people would see the smooth kinetic scroll and miss that the issue is with panning while the fingers are still on the touchpad. As for ‘spring’ edges, GTK already implements something similar, though IIUC not an exact spring effect because that is patent-encumbered or something along those lines.

Regarding animation vs interpolation

Animation is OK for scrolling and “stick to finger” gestures such as workspace switching, window overview, etc. However, it isn’t a good fit for cursor movement and related actions such as dragging windows, because it would completely change the acceleration profile. Animation might also not be the right choice for touchscreens, since it uses its own velocity, resulting in motion that doesn’t keep up with the finger position.

Event interpolation is applicable to touchpads as well as touchscreens, and it does a good enough job for all kinds of input interactions: mouse movement, window dragging, scrolling, etc. Since it is also easier to implement, I think interpolation is the right way forward.

Internal vs external touchpads

People using internal touchpads often don’t notice the issue. The reason comes down to the magnitude of the jitter, and of course individual sensitivity to jittery animation in general.

Apple’s external touchpads have an event rate of about 90Hz. Internal touchpads commonly have a rate of 125Hz. At a screen refresh rate of 60Hz, external touchpads give an alternating pattern of 1,2,1,2,1,2,… events per display frame. Internal touchpads give 2,2,2,2,2,2,2,2,2,2,2,3,2,2,2,2,2,2,2,2,2,2,2,3,2… i.e. most frames receive 2 input events and every 12th frame receives 3.

A couple of things are immediately apparent. First, for external touchpads every frame is jittery, while for internal touchpads we get 5 jank frames per second (1 jank per 200ms). The second, and more important, point is the magnitude of the jitter. For external touchpads the magnitude is 100%: 1 event/frame vs 2 events/frame. For internal touchpads the jank magnitude is only 50%: 2 events/frame vs 3 events/frame.

So while the issue definitely exists for internal touchpads, it is much less severe compared to external ones.

The external touchpads are probably more jittery because of a slower bus? That can happen with an internal touchpad too if it uses PS/2, for example with Synaptics touchpads that don’t have SMBus/intertouch enabled. The solution there is of course to enable it for that specific device.

And it’s not just scrolling: it’s scrolling and zooming and swiping, everything that uses 2 or more fingers. Edge scrolling is smooth. So it’s probably still libinput. :)

Oh, btw, please try edge scrolling too. Is it the same?

Well, the Apple external ‘Trackpad’ Gen2 can also be connected over USB, and it had the same event rate of about 88Hz regardless of whether it was used over Bluetooth or USB. It also exhibited the same jittery behavior.

You are absolutely right that it’s not just about scrolling. I can notice the issue even when I just move the mouse cursor around, and it is even easier to see when dragging windows across the screen. In addition, the 4-finger ‘stick-to-finger’ workspace switch gesture on GNOME Shell 3.32 is jittery as well, though that might also be attributable to performance issues.

Regarding edge scrolling, the behavior is the same, as is the event rate. Speaking of which, there is a nice little utility called evhz for measuring input device event rates. It opens the input devices directly, so the results are unaffected by libinput, the X/Wayland compositor, etc. As expected, the number of events per second it reports is the same as the number of events per second received by e.g. the GTK demo app.

No, the workspace gesture is very fast. It does have issues, but those are with the snap-back animation (see https://gitlab.gnome.org/GNOME/gnome-shell/merge_requests/605); the stick-to-finger part of it should be perfectly smooth. So it is the event rate.

Edge scrolling was different here because the problem only happened with 2 or more fingers; hence it happened with 2-finger scrolling, but not with edge scrolling, which uses 1 finger. Once I added my device to the SMBus whitelist (https://patchwork.kernel.org/patch/10910505/), it’s perfectly smooth.
