
Tuning performance in RI 1.2.1

abysubin
Joined: 2011-07-31

Hi,

In RI 1.2.1, after the removal of the UAL and DRI, a performance hit was introduced in tuning. It is caused by a GLib call made in Tuner.C (ri_platform/src), and it makes one of my tuning performance test cases fail, which requires tuning to complete within 1500 ms.

It is observed that whenever a tuning request is made, it takes around 1000 ms for the tuner to change state from NO_SYNC to SYNC.

Root Cause Identified:
In the normal flow, on receiving a tuning request, we ask the tuner module to tune. When the tune succeeds, we check whether signal lock has occurred by polling the tuner for its status. Once we observe the locked state, we send the SYNC event.

The performance hit comes from the polling mechanism. The polling is done with a GLib API, g_timeout_add_seconds(), to which we pass a 'time' in seconds, a 'function', and an 'object' as parameters. This API arranges for the function to be called at regular intervals until the function returns FALSE. The problem is that the precision of the INITIAL TRIGGER of this callback is only in seconds, i.e. the first invocation need not happen promptly. So, here, the function is invoked only after around 1000 ms, which makes the performance tests fail.

Proposed Fix:
In the GLib API documentation for g_timeout_add_seconds(), it is clearly stated:
"Note that the first call of the timer may not be precise for timeouts of one second. If you need finer precision and have such a timeout, you may want to use g_timeout_add() instead. " (http://developer.gnome.org/glib/2.30/glib-The-Main-Event-Loop.html#g-timeout-add-seconds)

I have now replaced the current call with g_timeout_add(), which takes the timeout interval in milliseconds, and gave a timeout of 1000 ms. It works fine: the tuner changes from NO_SYNC to SYNC state immediately, and the performance test cases pass consistently.

Why is the API g_timeout_add_seconds() being used instead of g_timeout_add()? To avoid overburdening the processor? Or is a delay expected before signal lock? If I use g_timeout_add() with an interval of 1000 ms, it will not be that much of a burden, right? Also, right now, the signal lock is happening instantly. Since tuning performance is very critical, can't this API be used instead?

Regards

Aby

smaynard
Joined: 2009-01-27

So there is no reason for not using g_timeout_add() other than that the RI platform code on the PC is not optimised. We can (and will) make this change for the next release.

abysubin
Joined: 2011-07-31

Hi smaynard,

The fix made it into the next release, but the polling interval chosen is 500 ms. This still makes our performance test fail sometimes. The initial trigger of g_timeout_add() happens only after the given interval (here, 500 ms). Hence, on every tune operation we simply wait 500 ms doing nothing, which is a considerable amount of time in performance figures. Can we reduce the interval further, to around 100-150 ms? We are simply waiting in the main thread during this polling operation anyway, so the polling frequency should not have that much impact, correct?

Also, since the initial trigger happens only after the given time interval, can we check the status of the tuner once before entering the polling operation itself, as shown below?

/* Check the tuner status once up front before the poll is set up. */
if (TRUE == getTunerStatus(object))
{
    // poll (every xxx ms) and get/send the tuner status to the stack...
    (void) g_timeout_add(xxx, (GSourceFunc) getTunerStatus, object);
}

Regards

Aby

smaynard
Joined: 2009-01-27

So while testing these changes, the immediate test of if (TRUE == getTunerStatus(object)) is causing false positives during fast-tune scenarios (i.e. rapid channel up/down). Observing tunes with VLC, GST, or HDHR, the fastest tune timing is near 225 ms, and usually over 250 ms. What tuner type are you using, and how are you achieving tunes in less than 150 ms?

smaynard
Joined: 2009-01-27

I will make the requested changes and test them...

ONLY MAKING THE TIMING CHANGE - SEE ADDITIONAL COMMENT

abysubin
Joined: 2011-07-31

Thanks for the update.

I have not tested rapid tuning test cases. I'm using the VLC tuner. I was looking for an option to improve the performance, and hence suggested checking the tuner status before entering polling (which in any case is first triggered only after the given polling interval). I'm getting the tune status immediately when testing normal channel selection. The polling interval suggested (100-150 ms) was also with the same intention of improving performance. Even if the tune takes around 250 ms, I could send the SYNC event on the 2nd or 3rd poll (i.e. in 300 ms). It would be better to send the tune SYNC event as early as possible, right?

Looking forward to your test results.

Regards

Aby