Hardware and driver support page
Interested in tvtime? Having trouble? Have suggestions? Why not hang
out with us on IRC at irc.freenode.net, channel #livid. Hope to see you there!
There is also a tvtime development mailing list that you may want to
subscribe to.
The tuner driver defaults to detecting a PAL tuner on many NTSC
capture cards, a notable example being the ATI TV Wonder cards. This
throws all of the cable frequencies out of alignment: channel numbers
are off by one and slightly detuned. To fix this, the tuner module
must be told the tuner type explicitly when it is loaded, as follows:
modprobe tuner type=2
You may have to first remove the bttv module if it is already loaded.
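For example, assuming both modules are already loaded, a full reload
sequence might look like this (module names and load order can vary
slightly between kernel versions):
rmmod bttv
rmmod tuner
modprobe tuner type=2
modprobe bttv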
To make this change automatic, in your modules.conf file, add the
following line:
options tuner type=2
This will ensure that whenever the tuner module is loaded, it will use
the correct tuner for this card. If this does not fix the problem, or
if you have an ATI TV Wonder that does NOT exhibit this problem,
please post a bug report on
the tvtime bugs page.
To attack this problem, we believe that the tuner module must be fixed
to correctly detect the tuner type, since so many users are affected.
To help with this, please follow up on
bug 711428.
If you do not have an ATI TV Wonder card, and you were one of the users
of the "canada-cable" frequency table, then you're likely
suffering from the same problem as above for the ATI TV Wonder card: an
incorrectly identified tuner. Please post a bug report on
the tvtime bugs page
indicating which type of card you have, and whether explicitly setting
the tuner type helped.
This section contains information about specific features and
dependencies in tvtime for the popular bttv driver.
4.1. Optimal driver configuration: The gbuffers setting
Most of the advanced deinterlacing algorithms require a history
of past input frames in order to predict motion in the video stream.
For best performance, tvtime requires that these history buffers be
provided by the driver itself. While the bttv driver in kernel 2.4.21
and later provides four buffers by default, most versions only provide
two.
Ensuring that your driver provides four buffers will give you access
to all of the available deinterlacing methods. To do this you must
provide the following option when loading the bttv driver:
modprobe bttv gbuffers=4
To make this change automatic, in your modules.conf file,
add the following line:
options bttv gbuffers=4
This will ensure that whenever the bttv module is loaded, it will
provide four frames to applications.
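To verify that the setting took effect, you can check the kernel log
after the module loads; most versions of the bttv driver print the
number of buffers they allocated, though the exact message text varies
between driver versions:
dmesg | grep bttv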
4.2. Special bttv luma correction mode
The popular Bt8x8 chip outputs non-standard colour values. While
most video overlays are calibrated for input in the range of 16-235 for
the luma component, and 16-240 for the chroma components, the Bt8x8
chip has two operating modes, both of which output values in a non-standard
way. As a special feature of its luma correction filter, tvtime can
compensate for this, ensuring slightly more correct colour for these
cards. The correction is applied automatically if you have a
Bt8x8-based capture card and turn on the luma correction filter.
To turn on luma correction, just hit the 'c' key. A value of
1.0 applies no correction of its own, but still enables the bttv luma
correction if you have an appropriate card.
4.3. bttv takes a long time to load!
On some cards without a tuner, bttv can take a long time to load (on the
order of minutes). To fix this, try using this option for the
i2c-algo-bit module:
options i2c-algo-bit bit_test=1
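To test this immediately without editing modules.conf, the same option
can be passed by hand when loading the module (unload it first if it is
already loaded):
modprobe i2c-algo-bit bit_test=1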
If that does not help, please post a bug report.
ATI has a line of video cards with video capture capabilities, and the
GATOS project develops drivers for these cards, including km, a kernel
driver providing a Video4Linux interface to the capture components on
these cards.
Note that this is just for video cards with capture capabilities,
not for the ATI TV Wonder line of capture cards which use the
Bt878 chip and are supported by the bttv driver.
The Video4Linux interface provided by GATOS
does not currently work with tvtime, and at this point it seems
unlikely that it ever will. The problems are:
- Rumor has it that the All-In-Wonder cards cannot both capture and
allow an application to use the hardware overlay scaler of the card. If
this is true, then tvtime would be unable to do any processing on the
video.
- The Video4Linux interface provided by the km driver is nonstandard.
It provides images at field rate instead of frame rate, and does not
seem to provide a way of detecting this. Supporting it may require
additional code in tvtime.
- The km driver only supports read() mode, which is currently
untested in the tvtime code. If someone could test this out, I'd
appreciate it.
Assistance in investigating this for conclusive information would be
appreciated. Please contact the tvtime developers if you can help
out.
Recently someone tried to use tvtime with the zoran driver for the
DC10plus and related cards, which is part of a larger project called
'mjpegtools'. This is an older card and, depending on your PCI
configuration, many users with seemingly high-powered machines
experience extremely poor performance. To quote an email exchange with
one of the driver authors:
From: Serguei Miridonov <mirsev@cicese.mx>
> A user of your driver and I were trying to figure out why
> tvtime wasn't working. tvtime operates by grabbing interlaced
> frames at full rate (29.97/25fps), 4:2:2 Y'CbCr sampling, and
> deinterlacing them to give an output at field rate (but increased
> to frame size) of 59.94fps/50fps progressive. The zoran driver,
> when capturing at 640x480 4:2:2, was providing nasty green
> frames/colourbars instead of providing any reasonable video
> content.
>
> The user determined that this is because of some memory
> limitations of the card, it can't handle this data rate,
It depends on the Zoran PCI controller (new versions work better than
the old one), and on the motherboard. Intel is much better than VIA,
for example...
> and so only lower resolutions like 320x240 or something can be
> provided when asking for raw 4:2:2 images, and that some sort of
> bigphysmem thing is required to get it to work, if at all.
>
> This restriction seems pretty reasonable given the hardware
> involved, so that is not my concern. My concern is simply that
> the driver did not fail or anything, but fooled tvtime into
> thinking everything was 'ok'. I would very much appreciate it if
> the driver could return some error when it is set in a mode that
> will not work. Is this at all possible?
It's not so simple. I can only guess which hardware combination will
work and which will not (see above). And the hardware provides no
means to detect any buffer overflow for uncompressed video capture.
Perhaps, this is because the card was primarily designed for MJPEG
capture (low bandwidth) and to provide video monitoring on the
computer display (no RAM access, just the PCI-PCI transfer from Zoran
to any graphics adapter). It can capture uncompressed video into memory
even at full frame rate but not on every system. My old system with
Pentium MMX 266MHz with Intel 430TX works better than newer Athlon XP
1800+ at VIA KT266A and DDR memory. Pentium MMX system can capture
uncompressed video to memory but can do nothing with it due to CPU and
disk I/O limitations. Athlon system can do anything (even encode into
MPEG-4 in real time) but some uncompressed frames are dropped due to
PCI limitations on VIA chipset...
Marcel Birthelmer wrote:
> Sorry for the misunderstanding, I wasn't complaining about a bug.
> I looked through zoran.c and I realized that the reason for
> grabdisplay not working was the 128 limit,
This is a limit from Linux kernel: it will not give you contiguous
memory longer than 128Kbytes without some special tricks: bigphysmem
patch or reserving memory at system boot using mem=XXX line in LILO or
GRUB configs.
> so i tested it, and grabdisplay does work with very small picture
> sizes. So my question is, what are my options for increasing this
> memory amount?
Bigphysmem patch or reserving memory at boot time. I suggest you
read the README file: it has notes about this.
> just changing 128 to 1024 doesn't seem to work, but also i don't
> really want to patch my kernel unless i have to. Any
> suggestions?
If you need help, please, describe your system: motherboard chipset,
processor, memory size and type, etc. Also please note that older
ZR36057 chip may not work properly on some motherboards. Newer
ZR36067 is better but it also may have problems with VIA chipsets.
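For reference, reserving memory at boot time as described above means
passing a mem= value smaller than your actual RAM, so that the kernel
leaves the top region untouched for the driver to claim. As an
illustrative example only (the right amount to reserve depends on your
card and capture size; see the driver's README), a machine with 128MB
of RAM might use the following line in lilo.conf:
append="mem=120M"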
A user has reported an unsuccessful attempt to use tvtime with
a G400 Marvel card supported by the Matrox Marvel driver project.
To quote:
It's taking 38ms to grab a frame, but the Marvel does
all its own scaling in hardware (i.e. not via Xv, well I
don't think it does). The annoying thing is that if the
hardware scaling is in use, Xv can't be used.
It would be nice if tvtime supported overlay mode so
that us Marvel users can at least use the 16:9 mode in
fullscreen.
I think we may be able to get this card working, but it will take
someone with a bit of know-how to figure out what the card expects,
and I guess the ability to shut off whatever hardware scaling prevents
us from using XVideo.
By default, the rivatv driver copies frames from video memory into
system memory, which can harm performance. Performance can be improved
by using DMA to transfer the frames; however, this option only works
when using the open-source "nv" driver in X, not with the NVIDIA binary
drivers.
You can turn on DMA support as follows in your modules.conf file:
options rivatv capbuffers=4 dma=1
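To try this out immediately without editing modules.conf, the same
options can be passed by hand when loading the module:
modprobe rivatv capbuffers=4 dma=1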
An unmaintained TV card driver for the Voodoo 3500 card is located
here:
http://sourceforge.net/projects/v3tv/
However, this driver does not support capturing, and therefore cannot be
used with tvtime.
The SiS XVideo driver in XFree86 versions up to 4.2.1 exhibits extremely
poor performance. This is fixed in later versions of the driver. If you
own a SiS card, please upgrade to the latest driver by Thomas
Winischhofer, or upgrade to X 4.3, which includes a recent version of
his driver. The web page for the SiS X driver is here:
http://www.winischhofer.net/linuxsis630.shtml
The tvtime bug report on this is bug
636338.
Around the end of 2002, it was noted that with the ATI Radeon
7000 card and the XFree86 4.2.1 packages in Debian, XVideo windows
are "always on top" and cannot be minimized. This was also seen in
RH8.0's XFree86 4.2 with a Radeon 32MB DDR, R100 QD (a 7200
card?). We have not researched this issue enough to know which cards
are affected or which versions fix the problem. More information on
this would be appreciated.
There is a bug in versions of the NVIDIA driver before
1.0-4349 and the nv driver in 4.3.0 and earlier for cards
which support wide filter kernels for high quality XVideo surfaces. If
the overscan value is larger than 0, then the card uses memory outside
of the image for interpolating around the edges, causing corruption all
around the outside of the tvtime window.
NVIDIA was made aware of the problem and fixed it for driver release
1.0-4349 and in the nv driver in XFree86 CVS. For
reference, the tvtime bug on this was bug
694144.
The open source NVIDIA driver for X ("nv", not "nvidia") does not support XVideo surfaces for
pre-GeForce cards like the TNT2. To use tvtime with these cards, you must use
the binary drivers.
That said, the binary drivers should give better XVideo performance than the open-source
driver, since they can use the kernel module component to negotiate DMA transfers for video
frames. If possible, stick to the binary drivers when using tvtime on NVIDIA hardware.
14.1. Corrupted frames
The TNT and TNT2 cards are very limited in their bandwidth, and this
problem will appear as horizontal lines in the video output unless
you're running at a fairly low resolution. Smaller output window sizes
will make the problem worse, as the DAC will have to read out more data
in the same amount of time. This problem should not occur with any of
the GeForce series cards.
14.2. Card can't downscale
A user of an NV5 RIVA TNT2 Ultra (rev 11) noted that the card
cannot downscale video, only crop it. This user was using version
1.0-4191 of the NVIDIA X drivers.
14.3. Poor performance at high resolution and refresh rate
A user of a TNT2 Vanta card reported that blit performance was
absolutely terrible until they went down to a resolution of 800x600, at
which point speed quadrupled. We believe this is due to bandwidth
problems on these cards.
The bttv driver seems to misdetect Prolink Pixelview cards pretty badly. There are
five Prolink cards listed in the CARDLIST for the bttv driver:
card=16 - Prolink Pixelview PlayTV (bt878)
card=37 - Prolink PixelView PlayTV pro
card=50 - Prolink PV-BT878P+4E / PixelView PlayTV PAK / Lenco MXTV-9578 CP
card=70 - Prolink Pixelview PV-BT878P+ (Rev.4C)
card=72 - Prolink Pixelview PV-BT878P+9B (PlayTV Pro rev.9B FM+NICAM)
The bttv driver cannot seem to autodetect these cards correctly. We've
found that cards which should use card=70 but are instead
loaded using card=72 suffer from audio mute/unmute failing
randomly, or even being inverted, resulting in no audio or just snippets
of audio. To avoid this, please make sure you are using the correct
variant for your card. Specifically, rev 9D of the newer Pixelview cards
should use card=70 and not card=72.
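As with the tuner type above, the card type can be forced when the
driver is loaded. For example, for a card that needs card=70
(substitute the value appropriate to your variant):
modprobe bttv card=70
To make the change permanent, add the corresponding line to your
modules.conf file:
options bttv card=70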
We have had a bug report about the linux-dvb driver found on the
linuxtv.org page. This driver supports, among
other things, the Hauppauge Nexus-S cards.
The 0.9.8 release of tvtime fails with the current released stable
driver, as the driver does not properly indicate what it supports. As
well, there are performance difficulties. From Michael Hunold, the
driver author:
to make a long story short: "linuxtv-dvb-1.0.0-pre1" and the so-called
DVB "head" or "newstruct" drivers are v4l1 drivers.
The CVS "dvb-kernel" driver is the new *v4l2* driver, which does not
have the problems you mentioned. YUYV byteorder is completely disabled,
capture performance problems should be gone, because I use Gerd Knorr's
"video-buf" capture buffer abstraction. Because of that, the performance
should be equal to the bttv / saa7134 driver.
tvtime 0.9.9 will work fine with this new V4L2 driver, and we recommend
that users of this card get the dvb-kernel driver and use that with tvtime.
For reference, the original project page for this driver is at
http://www.gdv.uni-hannover.de/~hunold1/linux/saa7146/index.html
and an explanation of the current DVB driver situation is at
http://linuxtv.org/dvb/drivers.xml.
As well, the thread on this issue on the linux-dvb mailing list can be found here:
http://linuxtv.org/mailinglists/linux-dvb/2003/03-2003/msg00020.html and the tvtime
bug report is bug 694544.
We've investigated a bug where the savage X driver does not initialize its
XVideo subsystem unless it starts on a modeline where the resolution equals the virtual
resolution. That is, if you have a virtual resolution of 1600x1200, make sure you start
on a 1600x1200 mode, otherwise XVideo won't initialize.
This bug was noted on the XFree86 devel list here:
http://www.mail-archive.com/devel%40xfree86.org/msg01248.html
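As an illustration, here is a sketch of an XF86Config-4 "Screen"
section where the startup mode matches the virtual resolution (the
identifiers and mode list are placeholders; adjust them to your own
configuration):
Section "Screen"
    Identifier   "Screen0"
    Device       "Card0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth   24
        Virtual 1600 1200
        Modes   "1600x1200" "1280x1024" "1024x768"
    EndSubSection
EndSection
Because "1600x1200" is listed first, X starts in that mode, which
matches the Virtual line, and the savage driver's XVideo support should
then initialize.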
Overlay mode, as can be used in xawtv or zapping, causes system
instabilities. This is because their 'overlay' mode allows the capture
card to write directly to video memory without locking it first, which
is unsafe for many video cards, specifically those which do not use a
linear framebuffer (such as NVIDIA cards with the binary drivers).
Use of the xawtv/overlay mode has caused system crashes, and
specifically, can cause tvtime to crash if it tries to use XVideo
features after the framebuffer area has been destroyed by the capture
card (see
tvtime bug 702539).
A workaround that avoids direct CPU access to the frames is the XFree86
"v4l" extension, which allows safe use of the framebuffer by capture
cards. tvtime does not support this mode, nor do we have plans to, as we
would not be able to composite our OSD on top of the video signal or
provide any processing features on the video.