Add a hardware constraint in patch_hdmi.c to disable 4- and 6-channel
modes, which are not supported by some older NVIDIA GPUs.
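Such a restriction can be applied with ALSA's hw-constraint helpers from
the PCM open callback; a minimal sketch under that assumption (the list
and function names are illustrative, not the literal patch):

    #include <linux/kernel.h>
    #include <sound/core.h>
    #include <sound/pcm.h>

    /* Channel counts these GPUs can actually handle (illustrative). */
    static unsigned int valid_channel_counts[] = { 2, 8 };

    static struct snd_pcm_hw_constraint_list channels_list = {
            .count = ARRAY_SIZE(valid_channel_counts),
            .list  = valid_channel_counts,
            .mask  = 0,
    };

    /* Call from the PCM open callback to drop 4/6-channel modes. */
    static int restrict_channel_counts(struct snd_pcm_substream *substream)
    {
            return snd_pcm_hw_constraint_list(substream->runtime, 0,
                                              SNDRV_PCM_HW_PARAM_CHANNELS,
                                              &channels_list);
    }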
Signed-off-by: Nitin Daga <ndaga@nvidia.com>
Acked-by: Stephen Warren <swarren@nvidia.com>
Cc: <stable@kernel.org>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
The dynamic PCM restriction based on ELD information may lead to
problems in some cases, e.g. when the receiver is turned off. Then it
may send a TV HDMI default such as channels = 2. Since it's still
plugged, the driver doesn't know whether this is the right configuration
for future use. If an app opens the device at this moment and the
receiver is turned on afterwards, the app is still stuck at channels = 2.
The right solution is to implement some kind of notification and
automatic re-open mechanism, but that is a goal far ahead.
This patch provides a workaround for such cases by adding a new module
option, static_hdmi_pcm, to the snd-hda-codec-hdmi module. When it is
set to true, the driver doesn't change the PCM parameters per ELD
information. Users who need a static configuration, as in the scenario
above, can set this to true.
The parameter can be changed dynamically via sysfs, too.
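Internally this is just a writable boolean module parameter; a minimal
sketch of such a definition (mode 0644 so it shows up writable under
sysfs):

    #include <linux/module.h>
    #include <linux/moduleparam.h>

    static bool static_hdmi_pcm;
    module_param(static_hdmi_pcm, bool, 0644);
    MODULE_PARM_DESC(static_hdmi_pcm,
                     "Don't restrict PCM parameters per ELD info");

With that, the value can also be flipped at runtime through
/sys/module/snd_hda_codec_hdmi/parameters/static_hdmi_pcm.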
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Cc: <stable@kernel.org>
Some newer chips have more than one HDMI output, but usually not all
of them are exposed as physical jacks. Removing the unused PCM devices
(as indicated by the BIOS in the pin config default) will reduce user
confusion, as users currently have to choose between several HDMI
devices, some of which do not work anyway.
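Whether a pin is actually routed to a jack can be read from its default
configuration; a rough sketch of such a check (the wrapper name is made
up, the pincfg helpers are the standard HD-audio ones):

    #include "hda_codec.h"
    #include "hda_local.h"

    /* Skip pins the BIOS marks as not connected to any physical port. */
    static bool hdmi_pin_is_exposed(struct hda_codec *codec, hda_nid_t pin_nid)
    {
            unsigned int def_conf = snd_hda_codec_get_pincfg(codec, pin_nid);

            return get_defcfg_connect(def_conf) != AC_JACK_PORT_NONE;
    }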
Signed-off-by: David Henningsson <david.henningsson@canonical.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Commit bbbe33900d added functionality to restrict PCM parameters
based on ELD info (derived from EDID data) of the audio sink.
However, it wrongly assumes that bits 0-2 of the first byte of a
CEA Short Audio Descriptor indicate a supported number of channels. In
reality, they indicate the maximum number of channels (as per CEA-861-D
7.5.2). This means the channel count can only be used to restrict
max_channels, not min_channels.
Restricting min_channels causes us to deny opening the device in stereo
mode if the sink only has SADs that declare larger numbers of channels
(like Primare SP32 AV Processor does).
Fix that by not restricting min_channels based on ELD information.
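For reference, byte 0 of a Short Audio Descriptor encodes "maximum
number of channels minus one" in its low three bits, so the value may
only feed the upper bound; roughly:

    /* Bits 0-2 of SAD byte 0: max channel count minus 1 (CEA-861-D 7.5.2). */
    static inline int sad_max_channels(const unsigned char *sad)
    {
            return (sad[0] & 0x07) + 1;
    }

    /* Use the result to cap channels_max only; leave channels_min alone. */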
Signed-off-by: Anssi Hannula <anssi.hannula@iki.fi>
Reported-by: Jean-Yves Avenard <jyavenard@gmail.com>
Tested-by: Jean-Yves Avenard <jyavenard@gmail.com>
Cc: stable@kernel.org
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Switch to the generic HDMI parser for codec id 1002:aa01 (ATI R6xx
HDMI), as the codec appears to work fine with it.
Note that the codec is still limited to stereo output, despite
reportedly being multichannel-capable. Some as-yet-unknown quirks will
be needed to get that working.
Testing was done on 2.6.36 by John Ettedgui.
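In the codec preset table this amounts to pointing the entry for that id
at the generic patch function; schematically (table name and entry
abridged, not the literal diff):

    static struct hda_codec_preset snd_hda_preset_atihdmi[] = {
            { .id = 0x1002aa01, .name = "R6xx HDMI",
              .patch = patch_generic_hdmi },
            {} /* terminator */
    };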
Signed-off-by: Anssi Hannula <anssi.hannula@iki.fi>
Tested-by: John Ettedgui <john.ettedgui@gmail.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Channels 2 and 3 were both wrongly mapped to HDMI slot 4. This shows
up as one channel being "lost" when playing in surround41 mode.
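The mapping is just a small ALSA-channel-to-HDMI-slot table, and the bug
was two entries pointing at the same slot; an illustrative (not literal)
6-channel mapping where each channel gets a distinct slot:

    /*
     * Illustrative ALSA channel -> HDMI/CEA slot map for 5.1-style
     * layouts; the key property is that no two channels share a slot.
     */
    static const int alsa_to_hdmi_slot[6] = {
            [0] = 0,        /* front left   */
            [1] = 1,        /* front right  */
            [2] = 4,        /* rear left    */
            [3] = 5,        /* rear right   */
            [4] = 3,        /* front center */
            [5] = 2,        /* LFE          */
    };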
Signed-off-by: Jerry Zhou <jerry.zhou@intel.com>
Signed-off-by: Wu Fengguang <fengguang.wu@intel.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
DisplayPort works mostly in the same way as HDMI, except that it expects
a slightly different audio infoframe format.
Quoting from "HDA036-A: Display Port Support and HDMI Miscellaneous
Corrections":
The HDMI specification defines a data island packet with a header of 4
bytes (3 bytes content + 1 byte ECC) and packet body of 32 bytes (28
bytes content and 4 bytes ECC). Display Port specification on the other
hand defines a data island packet (secondary data packet) with header of
4 bytes protected by 4 bytes of parity, and data of theoretically up to
1024 bytes with each 16 bytes chunk of data protected by 4 bytes of
parity. Note that the ECC or parity bytes are not present in the DIP
content populated by software and are hardware generated.
The DP connection is detected based on the ELD conn_type field, which
will be set by the graphics driver and can be overridden manually by
users through the /proc/asound/card0/eld* interface.
The DP infoframe has been tested OK on the Intel SandyBridge/CougarPoint
platform.
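In code terms the difference shows up only in the infoframe header
written into the DIP buffer; a rough sketch of the selection, assuming
the ELD conn_type convention of 0 = HDMI and 1 = DisplayPort (ECC/parity
bytes are never written since hardware generates them):

    #include <linux/types.h>

    /* Fill the 3-byte audio infoframe header for the given connection. */
    static void fill_audio_infoframe_header(int conn_type, u8 *hdr)
    {
            if (conn_type == 0) {           /* HDMI */
                    hdr[0] = 0x84;          /* infoframe type */
                    hdr[1] = 0x01;          /* version */
                    hdr[2] = 0x0a;          /* payload length */
            } else {                        /* DisplayPort */
                    hdr[0] = 0x84;          /* secondary data packet type */
                    hdr[1] = 0x1b;          /* data byte count */
                    hdr[2] = 0x11 << 2;     /* version */
            }
    }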
Signed-off-by: Wu Fengguang <fengguang.wu@intel.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
This patch merges all three patch_*hdmi variants into a single HDMI
parser. There is only one snd-hda-codec-hdmi module now.
In this patch, the behavior of each parser isn't changed much. The old
ATI parser still doesn't use the dynamic parser yet. These will be
cleaned up in later patches.
Also, this patch gets rid of the separate snd-hda-eld module and builds
the ELD code into snd-hda-codec-hdmi, since it is referenced only by
the HDMI parser.
Signed-off-by: Takashi Iwai <tiwai@suse.de>
The Intel and Nvidia HDMI codec drivers have their own implementations
of sticky PCM parameters. The HD-audio core now provides this as well,
so the two setups conflict. The fix is to simply remove that code from
patch_intelhdmi.c and patch_nvhdmi.c and call
snd_hda_codec_setup_stream() as usual.
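With the duplicated bookkeeping gone, the prepare callback reduces to
the plain stream setup; a sketch (callback signature as in struct
hda_pcm_ops of this era, channel_id fixed to 0):

    static int hdmi_playback_pcm_prepare(struct hda_pcm_stream *hinfo,
                                         struct hda_codec *codec,
                                         unsigned int stream_tag,
                                         unsigned int format,
                                         struct snd_pcm_substream *substream)
    {
            /* No local sticky-parameter tracking; the core handles it. */
            snd_hda_codec_setup_stream(codec, hinfo->nid, stream_tag, 0,
                                       format);
            return 0;
    }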
Reported-and-tested-by: Stephen Warren <swarren@nvidia.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
When a device is plugged in over HDMI, it passes some information in
the ELD, including the supported PCM parameters such as formats, rates
and channels. This patch adds a check to the PCM open callback of HDMI
streams so that only parameters the device supports are used.
When no device is plugged in, the parameters the codec supports are
used; this is usually everything the hardware can handle. This is for
apps that are started before the device is plugged in and do probing
(e.g. a sound daemon), so that probing still works even before the
device is plugged in.
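The check is effectively an intersection performed at open time: the
stream's capabilities are narrowed by what the ELD reports, but only
when a monitor is actually present. A rough sketch (the ELD-side field
names are illustrative, not the exact hda_eld ones):

    /* Narrow the stream capabilities by the monitor's ELD, if valid. */
    static void restrict_pcm_params_by_eld(struct hda_pcm_stream *hinfo,
                                           const struct hdmi_eld *eld)
    {
            if (!eld->eld_valid || !eld->sad_count)
                    return;         /* nothing plugged: keep codec caps */

            hinfo->rates &= eld->rates;             /* illustrative field */
            hinfo->formats &= eld->formats;         /* illustrative field */
            if (eld->max_channels &&
                eld->max_channels < hinfo->channels_max)
                    hinfo->channels_max = eld->max_channels;
    }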
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Passing IEC 61937 encapsulated compressed audio at bitrates over 6.144
Mbps (i.e. more than a single 2-channel 16-bit 192kHz IEC 60958 link)
over HDMI requires the use of HBR Audio Stream Packets instead of Audio
Sample Packets.
Enable HBR mode when the stream has 8 channels and the Non-PCM bit is
set.
If the audio converter is not connected to any HBR-capable pins, return
-EINVAL in prepare().
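A sketch of that decision, assuming the HBR pin capability and the EPT
pin-control bits defined in the HDA spec headers (AC_PINCAP_HBR,
AC_PINCTL_EPT_*); not the literal patch:

    #include <linux/errno.h>
    #include "hda_codec.h"
    #include "hda_local.h"

    /* HBR is required for 8-channel non-PCM (IEC 61937) passthrough. */
    static int hdmi_set_hbr(struct hda_codec *codec, hda_nid_t pin_nid,
                            int channels, bool non_pcm)
    {
            bool hbr = non_pcm && channels == 8;
            unsigned int pinctl, new_pinctl;

            if (!(snd_hda_query_pin_caps(codec, pin_nid) & AC_PINCAP_HBR))
                    return hbr ? -EINVAL : 0;       /* pin can't do HBR */

            pinctl = snd_hda_codec_read(codec, pin_nid, 0,
                                        AC_VERB_GET_PIN_WIDGET_CONTROL, 0);
            new_pinctl = (pinctl & ~AC_PINCTL_EPT) |
                    (hbr ? AC_PINCTL_EPT_HBR : AC_PINCTL_EPT_NATIVE);
            if (new_pinctl != pinctl)
                    snd_hda_codec_write(codec, pin_nid, 0,
                                        AC_VERB_SET_PIN_WIDGET_CONTROL,
                                        new_pinctl);
            return 0;
    }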
Signed-off-by: Anssi Hannula <anssi.hannula@iki.fi>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
The behavior of the Nvidia HDMI codec regarding pin-detection unsol
events is based on the old HD-audio spec, i.e. the PD bit indicates only
that an update occurred and doesn't show the current state. Since the
current code assumes the new behavior, pin detection doesn't work
reliably with this hardware.
This patch adds a flag indicating the old spec, and fixes the issue by
checking the pin detection explicitly for such hardware.
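With the flag set, the unsolicited-event handler cannot trust the PD bit
as a state and has to query the pin itself; a sketch under that
assumption:

    /*
     * On codecs following the old spec, the PD bit only says "something
     * changed", so read the current presence state explicitly.
     */
    static bool hdmi_pin_connected(struct hda_codec *codec, hda_nid_t pin_nid)
    {
            unsigned int sense;

            sense = snd_hda_codec_read(codec, pin_nid, 0,
                                       AC_VERB_GET_PIN_SENSE, 0);
            return !!(sense & AC_PINSENSE_PRESENCE);
    }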
Tested-by: Wei Ni <wni@nvidia.com>
Cc: <stable@kernel.org>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
The number of HDMI nodes is expected to go up in the future, so don't
fail hard on seeing extra converter/pin nodes. We can still operate
safely on the nodes within MAX_HDMI_CVTS/MAX_HDMI_PINS.
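In the enumeration loop this just means warning and skipping instead of
bailing out; a sketch (the spec field names are illustrative):

    /* Record a converter node, ignoring any beyond what we can track. */
    static int hdmi_add_cvt(struct hdmi_spec *spec, hda_nid_t nid)
    {
            if (spec->num_cvts >= MAX_HDMI_CVTS) {
                    snd_printk(KERN_WARNING
                               "HDMI: no space for converter nid 0x%x\n",
                               nid);
                    return 0;       /* skip instead of failing the probe */
            }
            spec->cvt[spec->num_cvts++] = nid;
            return 0;
    }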
Signed-off-by: Wu Fengguang <fengguang.wu@intel.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>
Create patch_hdmi.c to hold common code from intelhdmi and nvhdmi.
For now the patch_hdmi.c file is simply included by patch_intelhdmi.c
and patch_nvhdmi.c, and does not represent a real codec.
There are no behavior changes to intelhdmi. However nvhdmi made several
changes when copying code out of intelhdmi, which are all reverted in
this patch. Wei Ni confirmed that the reverted code actually works fine.
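Concretely, both codec-specific files pull the shared body in at compile
time; schematically:

    /* patch_intelhdmi.c (patch_nvhdmi.c does the same), schematically: */
    #include "hda_codec.h"
    #include "hda_local.h"
    #include "patch_hdmi.c"   /* shared HDMI code, not a codec of its own */

    /* codec-specific preset tables and patch entry points follow */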
Tested-by: Wei Ni <wni@nvidia.com>
Signed-off-by: Wu Fengguang <fengguang.wu@intel.com>
Signed-off-by: Takashi Iwai <tiwai@suse.de>