Configure MockRealtimeAudioSourceMac to generate stereo audio
Created attachment 300992 [details] Patch
rdar://problem/30401723
Comment on attachment 300992 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=300992&action=review

> Source/WebCore/platform/audio/mac/AudioSampleBufferList.cpp:53
> + m_bufferList = std::make_unique<WebAudioBufferList>(format, m_maxBufferSizePerChannel);

This is unnecessary, the call to reset() below will do it.

> Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm:207
> + OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, nullptr, m_list->list(), m_listBufferSize, kCFAllocatorSystemDefault, kCFAllocatorSystemDefault, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &buffer);

Isn't this already done by the call to WebAudioBufferList::WebAudioBufferList(const CAAudioStreamDescription&, CMSampleBufferRef)?
Comment on attachment 300992 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=300992&action=review

>> Source/WebCore/platform/audio/mac/AudioSampleBufferList.cpp:53
>> + m_bufferList = std::make_unique<WebAudioBufferList>(format, m_maxBufferSizePerChannel);
>
> This is unnecessary, the call to reset() below will do it.

Indeed!

>> Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm:207
>> + OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, nullptr, m_list->list(), m_listBufferSize, kCFAllocatorSystemDefault, kCFAllocatorSystemDefault, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &buffer);
>
> Isn't this already done by the call to WebAudioBufferList::WebAudioBufferList(const CAAudioStreamDescription&, CMSampleBufferRef)?

You're right!

> Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:143
>      m_streamFormat.mSampleRate = m_sampleRate;
>      m_streamFormat.mFormatID = kAudioFormatLinearPCM;
> -    m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
> -    m_streamFormat.mBytesPerPacket = bytesPerFloat * channelCount;
> +    m_streamFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved;
> +    m_streamFormat.mBytesPerPacket = bytesPerFloat;
>      m_streamFormat.mFramesPerPacket = 1;
> -    m_streamFormat.mBytesPerFrame = bytesPerFloat * channelCount;
> +    m_streamFormat.mBytesPerFrame = bytesPerFloat;
>      m_streamFormat.mChannelsPerFrame = channelCount;
>      m_streamFormat.mBitsPerChannel = bitsPerByte * bytesPerFloat;

It also looks like this can all be replaced by a call to FillOutASBDForLPCM(). Will change before landing.
Created attachment 301014 [details] Patch for landing
Created attachment 301016 [details] Patch for landing
Comment on attachment 301016 [details]
Patch for landing

View in context: https://bugs.webkit.org/attachment.cgi?id=301016&action=review

> Source/WebCore/platform/mediastream/RealtimeMediaSource.h:113
> + void audioSamplesAvailable(const MediaTime&, PlatformAudioData&, const AudioStreamDescription&, size_t);

It is not very clear to me why this takes a PlatformAudioData&. I would expect either a const PlatformAudioData& or a PlatformAudioData&& as parameter.
Also, is the numberOfFrames a parameter on its own, or should it be part of PlatformAudioData?
Comment on attachment 301016 [details]
Patch for landing

Clearing flags on attachment: 301016

Committed r211959: <http://trac.webkit.org/changeset/211959>
(In reply to comment #7)
> Comment on attachment 301016 [details]
> Patch for landing
>
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=301016&action=review
>
> > Source/WebCore/platform/mediastream/RealtimeMediaSource.h:113
> > + void audioSamplesAvailable(const MediaTime&, PlatformAudioData&, const AudioStreamDescription&, size_t);
>
> It is not very clear to me why this takes a PlatformAudioData&.
> I would expect either a const PlatformAudioData& or a PlatformAudioData&& as
> parameter.

It's not a && because the calling class retains ownership of the object. It's not const because we'd have to cast away the constness when accessing the wrapped ABL.

> Also, is the numberOfFrames a parameter on its own, or should it be part of
> PlatformAudioData?

The PlatformAudioData has a capacity, but not a numberOfFrames. I don't think we want to tie together the concept of "please copy this number of frames out of this audio data" with the audio data structure itself.
(In reply to comment #9)
> It's not const because we'd have to cast away the constness when accessing
> the wrapped ABL.

So it looks like you can make this entire path const-correct, but it's a significant code change, deserving of its own patch.
(In reply to comment #10)
> (In reply to comment #9)
> > It's not const because we'd have to cast away the constness when accessing
> > the wrapped ABL.
>
> So it looks like you can make this entire path const-correct, but it's a
> significant code change, deserving of its own patch.

See <https://bugs.webkit.org/show_bug.cgi?id=168051>
It seems this landed: https://github.com/WebKit/WebKit/commit/c71329a506449e928ab40cc4a0496260f6881336 and was not backed out, and the follow-up bug corrected the remaining bits. Marking this as "RESOLVED FIXED".