WebKit Bugzilla
Bug 70155: Add AudioSourceProviderClient and setFormat() method so we can know audio stream format
Status: RESOLVED FIXED
https://bugs.webkit.org/show_bug.cgi?id=70155
Chris Rogers
Reported 2011-10-14 16:06:33 PDT
Add MediaPlayer::interceptAudioStream() method so we can know audio stream format
Attachments
Patch (7.08 KB, patch) 2011-10-14 16:09 PDT, Chris Rogers (no flags)
Patch (7.54 KB, patch) 2011-10-21 12:51 PDT, Chris Rogers (no flags)
Patch (7.80 KB, patch) 2011-10-24 17:00 PDT, Chris Rogers (eric.carlson: review+, webkit.review.bot: commit-queue-)
Chris Rogers
Comment 1
2011-10-14 16:09:21 PDT
Created attachment 111104: Patch
Chris Rogers
Comment 2
2011-10-14 16:15:56 PDT
The introduction of MediaPlayer::audioSourceProvider() was a good first step at gaining access to the audio stream from an HTMLMediaElement, and it *basically* works with a prototype in Chrome. However, because the sample-rate of this stream may differ from the AudioContext sample-rate, a conversion may be necessary. Moreover, the .src attribute may be changed dynamically during <audio>/<video> playback resulting in changes to the number of channels and sample-rate. This patch adds a callback mechanism by adding AudioSourceProvider::Client which has a setFormat() method to receive the initial format and any subsequent changes to the audio stream format. A subsequent patch will actually make use of this information to dynamically configure any necessary converters.
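For orientation, a minimal sketch of the interface this describes, reconstructed from the patch excerpts quoted in the review comments below (illustrative, not the committed header):

    // WebCore/platform/audio/AudioSourceProvider.h (sketch)
    namespace WebCore {

    class AudioSourceProvider {
    public:
        class Client {
        public:
            // Called on the main thread with the initial stream format, and
            // again whenever it changes (e.g. the element's .src is swapped
            // mid-playback).
            virtual void setFormat(size_t numberOfChannels, float sampleRate) = 0;

        protected:
            virtual ~Client() { }
        };

        virtual ~AudioSourceProvider() { }
    };

    } // namespace WebCore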
Eric Carlson
Comment 3
2011-10-15 10:23:45 PDT
Comment on attachment 111104: Patch
View in context: https://bugs.webkit.org/attachment.cgi?id=111104&action=review
> Source/WebCore/ChangeLog:8
> + No new tests. This is covered by existing MediaElementAudioSourceNode tests.
This comment isn't correct; the existing tests don't cover the new code. It can't be tested because it isn't completely functional yet.
> Source/WebCore/ChangeLog:22
> + * html/HTMLMediaElement.cpp:
> + (WebCore::HTMLMediaElement::createMediaPlayer):
> + (WebCore::HTMLMediaElement::setAudioSourceNode):
> + * platform/audio/AudioSourceProvider.h:
> + (WebCore::AudioSourceProvider::Client::~Client):
> + * platform/graphics/MediaPlayer.cpp:
> + (WebCore::MediaPlayer::interceptAudioStream):
> + * platform/graphics/MediaPlayer.h:
> + * platform/graphics/MediaPlayerPrivate.h:
> + (WebCore::MediaPlayerPrivateInterface::interceptAudioStream):
> + * webaudio/MediaElementAudioSourceNode.cpp:
> + (WebCore::MediaElementAudioSourceNode::setFormat):
> + * webaudio/MediaElementAudioSourceNode.h:
I would prefer to have comments about what changed here.
> Source/WebCore/html/HTMLMediaElement.cpp:3019
> void HTMLMediaElement::setAudioSourceNode(MediaElementAudioSourceNode* sourceNode)
> {
>     m_audioSourceNode = sourceNode;
> +
> +    if (m_player)
> +        m_player->interceptAudioStream(m_audioSourceNode);
> }
I am not wild about the name "interceptAudioStream". Why do HTMLMediaElement and MediaPlayerPrivate have different names?
> Source/WebCore/platform/graphics/MediaPlayer.h:334
> + // interceptAudioStream() taps into the audio stream (disabling the normal playback path).
> + // It should be called on the main thread.
> + // The client will be called back (on the main thread) when the audio format is available.
> + void interceptAudioStream(AudioSourceProvider::Client*);
Does it necessarily disable the normal playback path? For example, what happens when play is called on a <video> element that has audio and video? I would expect that the video plays in sync with the processed audio. If the normal playback path is disabled, who is responsible for maintaining sync?
Chris Rogers
Comment 4
2011-10-15 10:54:30 PDT
Hi Eric, thanks for having a look. I'll try to partially answer some of your comments here.

(In reply to comment #3)
> (From update of attachment 111104)
> View in context: https://bugs.webkit.org/attachment.cgi?id=111104&action=review
>
> > Source/WebCore/html/HTMLMediaElement.cpp:3019
> > void HTMLMediaElement::setAudioSourceNode(MediaElementAudioSourceNode* sourceNode)
> > {
> >     m_audioSourceNode = sourceNode;
> > +
> > +    if (m_player)
> > +        m_player->interceptAudioStream(m_audioSourceNode);
> > }
>
> I am not wild about the name "interceptAudioStream". Why do HTMLMediaElement and MediaPlayerPrivate have different names?
I'm not too attached to any names here. Would you suggest that I use "setAudioSourceNode" also for MediaPlayer/MediaPlayerPrivate instead of "interceptAudioStream"?
> > Source/WebCore/platform/graphics/MediaPlayer.h:334
> > + // interceptAudioStream() taps into the audio stream (disabling the normal playback path).
> > + // It should be called on the main thread.
> > + // The client will be called back (on the main thread) when the audio format is available.
> > + void interceptAudioStream(AudioSourceProvider::Client*);
>
> Does it necessarily disable the normal playback path?
I think it really should. If the idea is to apply a filter of some kind, such as a graphic EQ or any other effect, and the normal playback path is not disabled, the result is not the desired one: we would have the EQ'd sound mixed in with the full "dry" original sound. Or the idea might be to split out a single channel (the left channel, for instance) and not play the right channel...
> For example, what happens when play is called on a <video> element that has audio and video? I would expect that the video plays in sync with the processed audio. If the normal playback path is disabled, who is responsible for maintaining sync?
The implementation still needs to maintain sync between video and audio. The difference is that the MediaElementAudioSourceNode is now pulling the audio stream instead of the audio hardware (probably an AUGraph in the Mac port's case). The exact implementation details will vary from port to port. It may be worth talking about some of these details offline in IRC/phone.
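To make the pull model concrete, a hedged sketch of a render-quantum pull, assuming WebKit's AudioSourceProvider::provideInput(AudioBus*, size_t) and a hypothetical m_provider member holding the player's provider (the real wiring is port-specific):

    // Sketch: once the tap is installed, the Web Audio graph pulls decoded
    // audio from the player each render quantum; the hardware no longer pulls
    // from the element directly, but the element's clock still drives A/V sync.
    void MediaElementAudioSourceNode::process(size_t framesToProcess)
    {
        AudioBus* outputBus = output(0)->bus();

        if (!m_provider) { // hypothetical member: the player's audio provider
            outputBus->zero();
            return;
        }

        // Fill this node's output from the media element's audio stream.
        m_provider->provideInput(outputBus, framesToProcess);
    }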
Eric Carlson
Comment 5
2011-10-15 11:23:08 PDT
(In reply to comment #4)
> Hi Eric, thanks for having a look. I'll try to partially answer some of your comments here.
>
> (In reply to comment #3)
> > I am not wild about the name "interceptAudioStream". Why do HTMLMediaElement and MediaPlayerPrivate have different names?
>
> I'm not too attached to any names here. Would you suggest that I use "setAudioSourceNode" also for MediaPlayer/MediaPlayerPrivate instead of "interceptAudioStream"?
Yes, I think it makes sense to use the same names.
> > > Source/WebCore/platform/graphics/MediaPlayer.h:334
> > > + // interceptAudioStream() taps into the audio stream (disabling the normal playback path).
> > > + // It should be called on the main thread.
> > > + // The client will be called back (on the main thread) when the audio format is available.
> > > + void interceptAudioStream(AudioSourceProvider::Client*);
> >
> > Does it necessarily disable the normal playback path?
>
> I think it really should, because if the idea is to apply a filter of some kind, such as a graphic EQ or any other effect, then if the normal playback path is not disabled, the effect is not the desired one because we would have the EQ'd sound mixed in with the full "dry" original sound. Or, if the idea is to split out a single channel (left-channel for instance) and not play the right channel...
OK, I misunderstood the comment - I assumed it meant that the playback was completely disconnected from the media element. I am not sure how to clarify the comment; maybe say something about how the audio is processed by the graph before being rendered?
> > For example, what happens when play is called on a <video> element that has audio and video? I would expect that the video plays in sync with the processed audio. If the normal playback path is disabled, who is responsible for maintaining sync?
>
> The implementation still needs to continue to maintain sync between video and audio. The difference being that the MediaElementAudioSourceNode is now pulling the audio stream instead of the audio hardware (AUGraph probably in the mac port case). The exact implementation details will vary from port-to-port. It may be worth talking about some of these details offline in IRC/phone.

Agreed, we should talk about these details soon and get some text into the spec.
Daniel Bates
Comment 6
2011-10-17 00:49:49 PDT
Comment on attachment 111104: Patch
Attachment 111104 did not pass mac-ews (mac). Output:
http://queues.webkit.org/results/10078857
Chris Rogers
Comment 7
2011-10-18 15:42:48 PDT
(In reply to comment #5)
> (In reply to comment #4)
> > Hi Eric, thanks for having a look. I'll try to partially answer some of your comments here.
> >
> > (In reply to comment #3)
> > > I am not wild about the name "interceptAudioStream". Why do HTMLMediaElement and MediaPlayerPrivate have different names?
> >
> > I'm not too attached to any names here. Would you suggest that I use "setAudioSourceNode" also for MediaPlayer/MediaPlayerPrivate instead of "interceptAudioStream"?
>
> Yes, I think it makes sense to use the same names.
Ahhh, now I remember why I didn't make a setAudioSourceNode(MediaElementAudioSourceNode*) method on MediaPlayer -- it was because MediaPlayer and MediaPlayerPrivate are in WebCore/platform and aren't supposed to access classes (like MediaElementAudioSourceNode) which are defined in WebCore. Instead, I tried a more general API using the WebCore/platform class AudioSourceProvider::Client. It's a layering rule I thought I heard mentioned by Darin Adler a while back.

I'm not sure if this layering issue is important or not, and would like your guidance here with setAudioSourceNode(MediaElementAudioSourceNode*) as opposed to something like interceptAudioStream() (or a better-named version). Thanks
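To illustrate the layering rule in question, a sketch using the first patch's names, in which MediaPlayer (WebCore/platform) sees only the abstract client type while the Web Audio node (WebCore proper) implements it; bodies are illustrative:

    // WebCore/platform (lower layer): only the abstract client type is
    // visible, so MediaPlayer never names a Web Audio class.
    void MediaPlayer::interceptAudioStream(AudioSourceProvider::Client* client)
    {
        m_private->interceptAudioStream(client); // forward to the port's player
    }

    // WebCore/webaudio (upper layer): the concrete node implements the
    // platform interface, so HTMLMediaElement can hand it down across the
    // layering boundary as a plain Client*.
    class MediaElementAudioSourceNode : public AudioNode, public AudioSourceProvider::Client {
    public:
        virtual void setFormat(size_t numberOfChannels, float sampleRate);
        // ...
    };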
Eric Carlson
Comment 8
2011-10-19 13:18:32 PDT
(In reply to comment #7)
> Ahhh, now I remember why I didn't make a setAudioSourceNode(MediaElementAudioSourceNode*) method on MediaPlayer -- it was because MediaPlayer and MediaPlayerPrivate are in WebCore/platform and aren't supposed to access classes (like MediaElementAudioSourceNode) which are defined in WebCore. Instead, I tried a more general API which is using a WebCore/platform class AudioSourceProvider::Client. It's a layering rule I thought I heard mentioned by Darin Adler a while back.
>
> I'm not sure if this layering issue is important or not, and would like your guidance here with setAudioSourceNode(MediaElementAudioSourceNode*) as opposed to something like interceptAudioStream() (or a better-named version).
Getting the layering right is important. My point was that "interceptAudioStream" didn't help me understand what the method did, and that as long as the HTMLMediaElement method just passes the parameter through:

void HTMLMediaElement::setAudioSourceNode(MediaElementAudioSourceNode* sourceNode)
{
    m_audioSourceNode = sourceNode;

    if (m_player)
        m_player->interceptAudioStream(m_audioSourceNode);
}

we might as well use the same name. If you pass a different object the player method name shouldn't be the same, but I still don't think "interceptAudioStream" is right.
Eric Carlson
Comment 9
2011-10-19 13:22:40 PDT
FYI, see https://bugs.webkit.org/show_bug.cgi?id=49192 for a discussion of automatically detecting and rejecting layering violations.
Chris Rogers
Comment 10
2011-10-19 14:03:35 PDT
(In reply to comment #8)
> Getting the layering right is important. My point was that "interceptAudioStream" didn't help me understand what the method did, and that as long as the HTMLMediaElement method just passes the parameter through, we might as well use the same name.
>
> If you pass a different object the player method name shouldn't be the same, but I still don't think "interceptAudioStream" is right.
If the layering is important, then I think we won't be able to use MediaElementAudioSourceNode directly in a MediaPlayer method (since it's not part of "platform"). The abstraction I've made in WebCore/platform for a similar (but more general) type of object is AudioSourceProvider::Client.

What do you think of the name:

    void MediaPlayer::setAudioSourceProviderClient(AudioSourceProvider::Client*);

or, possibly even better/simpler:

    void MediaPlayer::setAudioSource(AudioSourceProvider::Client*);

or, instead of defining the class as AudioSourceProvider::Client, define an "AudioSource" class, so we have:

    void MediaPlayer::setAudioSource(AudioSource*);

My preference is the 2nd option:

    void MediaPlayer::setAudioSource(AudioSourceProvider::Client*);

What do you think?
Eric Carlson
Comment 11
2011-10-20 07:07:05 PDT
(In reply to comment #10)
> My preference is the 2nd option:
>
>     void MediaPlayer::setAudioSource(AudioSourceProvider::Client*);
>
> What do you think?

This seems fine.
Chris Rogers
Comment 12
2011-10-21 12:51:36 PDT
Created attachment 112003: Patch
Chris Rogers
Comment 13
2011-10-21 12:54:01 PDT
Thanks Eric, I think this latest patch addresses your comments. Naming-wise, I've changed interceptAudioStream() to setAudioSource().
Daniel Bates
Comment 14
2011-10-21 14:49:39 PDT
Comment on attachment 112003: Patch
Attachment 112003 did not pass mac-ews (mac). Output:
http://queues.webkit.org/results/10197067
Chris Rogers
Comment 15
2011-10-24 17:00:47 PDT
Created attachment 112278: Patch
Chris Rogers
Comment 16
2011-10-24 17:04:48 PDT
Eric, this latest patch has some simplifications that Darin Fisher recommended in an offline discussion. The changes are:

* Break out AudioSourceProviderClient as a separate class (and header file). Darin suggests this because we can then forward-declare AudioSourceProviderClient (we can't do that with a nested class).
* Simplify MediaPlayer so there's no additional "setAudioSource()" method. Instead, a "setClient()" method is added to AudioSourceProvider, which accomplishes the same thing.
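A rough sketch of the simplified shape being described, using the names from this comment and the review excerpts in comment 19 (bodies illustrative, not the committed headers):

    // AudioSourceProviderClient.h (sketch): a standalone class, so other
    // headers can forward-declare it, which a nested class would not allow.
    class AudioSourceProviderClient {
    public:
        virtual void setFormat(size_t numberOfChannels, float sampleRate) = 0;

    protected:
        virtual ~AudioSourceProviderClient() { }
    };

    // AudioSourceProvider.h (sketch): setClient() replaces the earlier
    // MediaPlayer::setAudioSource() proposal; the client is called back when
    // the audio format is available or changes.
    class AudioSourceProviderClient; // forward declaration is the point

    class AudioSourceProvider {
    public:
        virtual void setClient(AudioSourceProviderClient*) { }
        virtual ~AudioSourceProvider() { }
    };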
WebKit Review Bot
Comment 17
2011-10-24 17:10:11 PDT
Comment on attachment 112278: Patch
Attachment 112278 did not pass chromium-ews (chromium-xvfb). Output:
http://queues.webkit.org/results/10208114
Daniel Bates
Comment 18
2011-10-24 22:30:00 PDT
Comment on attachment 112278: Patch
Attachment 112278 did not pass mac-ews (mac). Output:
http://queues.webkit.org/results/10210113
Eric Carlson
Comment 19
2011-10-25 09:09:11 PDT
Comment on attachment 112278: Patch
View in context: https://bugs.webkit.org/attachment.cgi?id=112278&action=review
Looks like HTMLMediaElement.cpp needs to include AudioSourceProviderClient.h.
> Source/WebCore/html/HTMLMediaElement.cpp:3008
> +    // If we're creating the player from scratch, make sure its AudioSourceProvider knows about the MediaElementAudioSourceNode.
> +    if (audioSourceProvider())
> +        audioSourceProvider()->setClient(m_audioSourceNode);
Nit: this comment doesn't make sense; how does "if (audioSourceProvider())" tell you whether the player was created from scratch?
> Source/WebCore/platform/audio/AudioSourceProvider.h:43
> +    // If a client is set, we call it back when the audio format is available.
Nit: "when the audio format is available *or changes*"?
> Source/WebCore/webaudio/MediaElementAudioSourceNode.cpp:63
> +void MediaElementAudioSourceNode::setFormat(size_t, float)
> +{
> +    // FIXME: setup a sample-rate converter if necessary to convert to the AudioContext sample-rate.
> +}
It is probably worth having an ASSERT here as long as the sample-rate converter setup is missing.
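A minimal sketch of what the suggested assertion could look like, assuming AudioNode::sampleRate() returns the context's rate (illustrative, not the committed code):

    void MediaElementAudioSourceNode::setFormat(size_t numberOfChannels, float sourceSampleRate)
    {
        // FIXME: setup a sample-rate converter if necessary to convert to the
        // AudioContext sample-rate.
        // Until that exists, fail loudly in debug builds on formats this node
        // cannot yet handle, rather than silently rendering at the wrong rate.
        ASSERT(numberOfChannels > 0);
        ASSERT(sourceSampleRate == sampleRate());
    }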
Chris Rogers
Comment 20
2011-10-26 12:58:49 PDT
Committed r98512: <http://trac.webkit.org/changeset/98512>
Adam Roben (:aroben)
Comment 21
2011-10-27 09:42:06 PDT
Windows build fix in r98592.
Adam Roben (:aroben)
Comment 22
2011-10-27 09:42:19 PDT
Not sure why the Windows EWS didn't catch this!
Adam Roben (:aroben)
Comment 23
2011-10-27 09:42:58 PDT
Ah, the committed patch was not the same as the one that went through the EWS.