<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!DOCTYPE bugzilla SYSTEM "https://bugs.webkit.org/page.cgi?id=bugzilla.dtd">

<bugzilla version="5.0.4.1"
          urlbase="https://bugs.webkit.org/"
          
          maintainer="admin@webkit.org"
>

    <bug>
          <bug_id>179363</bug_id>
          
          <creation_ts>2017-11-06 23:16:54 -0800</creation_ts>
          <short_desc>iOS calling getUserMedia() again kills video display of first getUserMedia()</short_desc>
          <delta_ts>2022-11-07 16:48:30 -0800</delta_ts>
          <reporter_accessible>1</reporter_accessible>
          <cclist_accessible>1</cclist_accessible>
          <classification_id>1</classification_id>
          <classification>Unclassified</classification>
          <product>WebKit</product>
          <component>WebRTC</component>
          <version>Safari 11</version>
          <rep_platform>iPhone / iPad</rep_platform>
          <op_sys>iOS 11</op_sys>
          <bug_status>RESOLVED</bug_status>
          <resolution>CONFIGURATION CHANGED</resolution>
          
          <see_also>https://bugs.webkit.org/show_bug.cgi?id=238492</see_also>
          <bug_file_loc></bug_file_loc>
          <status_whiteboard></status_whiteboard>
          <keywords>InRadar</keywords>
          <priority>P2</priority>
          <bug_severity>Normal</bug_severity>
          <target_milestone>---</target_milestone>
          
          
          <everconfirmed>1</everconfirmed>
          <reporter name="Chad Phillips">webkit</reporter>
          <assigned_to name="Nobody">webkit-unassigned</assigned_to>
          <cc>abarnes</cc>
    
    <cc>bfulgham</cc>
    
    <cc>daginge</cc>
    
    <cc>davy.de.durpel</cc>
    
    <cc>eric.amram</cc>
    
    <cc>eric.carlson</cc>
    
    <cc>gabor</cc>
    
    <cc>jaya.allamsetty</cc>
    
    <cc>jonlee</cc>
    
    <cc>joseph.eugene</cc>
    
    <cc>lee</cc>
    
    <cc>mb</cc>
    
    <cc>mbrice</cc>
    
    <cc>mitch</cc>
    
    <cc>oanguenot</cc>
    
    <cc>ramyaddurairaj</cc>
    
    <cc>relbeatsdev</cc>
    
    <cc>sak126p</cc>
    
    <cc>seba.kerckhof</cc>
    
    <cc>selawre</cc>
    
    <cc>shreyas</cc>
    
    <cc>ssim</cc>
    
    <cc>webkit</cc>
    
    <cc>webkit-bug-importer</cc>
    
    <cc>youennf</cc>
          

      

      

      

          <comment_sort_order>oldest_to_newest</comment_sort_order>  
          <long_desc isprivate="0" >
    <commentid>1368940</commentid>
    <comment_count>0</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2017-11-06 23:16:54 -0800</bug_when>
<thetext>Verified with the reference code below:

On iOS, a second call to getUserMedia() kills the display of a video stream obtained by an earlier call to getUserMedia(). The original stream displays fine until the subsequent getUserMedia() call, then goes black.

Note that this doesn&apos;t happen on Desktop Safari, only on iOS Safari in my tests.

Reference code:

&lt;!DOCTYPE html&gt;
&lt;html&gt;
  &lt;body&gt;
    &lt;div&gt;
      &lt;video id=&quot;video1&quot; autoplay playsinline&gt;&lt;/video&gt;
    &lt;/div&gt;
    &lt;script type=&quot;text/javascript&quot;&gt;
      var constraints1 = {
        audio: false,
        video: {
          height: {
            max: 480,
          },
          width: {
            max: 640,
          },
        },
      };
      navigator.mediaDevices.getUserMedia(constraints1).then(function(stream) {
        var video1 = document.getElementById(&apos;video1&apos;);
        video1.srcObject = stream;
      }).catch(function(err) {
        console.error(&quot;Device access checks failed: &quot;, err, constraints1);
      });
      var constraints2 = {
        audio: false,
        video: true,
      };
      navigator.mediaDevices.getUserMedia(constraints2).then(function(stream) {
      }).catch(function(err) {
        console.error(&quot;Device access checks failed: &quot;, err, constraints2);
      });
    &lt;/script&gt;
  &lt;/body&gt;
&lt;/html&gt;</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1369232</commentid>
    <comment_count>1</comment_count>
      <attachid>326267</attachid>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2017-11-07 14:58:03 -0800</bug_when>
    <thetext>Created attachment 326267
Multiple getUserMedia() streams controlled by UI.

Attaching a more robust test case, allowing the user to trigger multiple streams obtained via getUserMedia().

Both &apos;Video 1&apos; and &apos;Video 2&apos; can be started and stopped fine as long as the other is not actively streaming. But if you start one while another is streaming to screen, the first stream goes black.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1374326</commentid>
    <comment_count>2</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2017-11-22 20:05:49 -0800</bug_when>
<thetext>I&apos;ve spent some more time digging into this issue, and it turns out that video 1&apos;s video MediaStreamTrack has its &apos;muted&apos; property set to true upon another gUM call that requests a video stream.

It&apos;s not even necessary for this gUM call to do anything with the video stream (like display it) for the previous video MediaStreamTrack to be muted.

Furthermore, I see no way via the API to unmute the muted video track -- the &apos;muted&apos; property is read-only, and toggling the &apos;enabled&apos; property of either video track has no effect on its state.

Is this issue related to the &apos;Multiple Simultaneous Audio or Video Streams&apos; limitation noted at https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html ?

If so, it&apos;s going to be severely limiting for certain multiparty videoconferencing applications. For example:

 - It&apos;s common practice to show a user their own local video feed with one (higher resolution) stream, and publish another (lower resolution) stream to other users

 - To accommodate receivers with different bandwidth capabilities, a common practice is to publish both a high resolution and a low resolution stream</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1417088</commentid>
    <comment_count>3</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2018-04-23 21:14:27 -0700</bug_when>
    <thetext>This is still an issue with iOS 11.3, would love somebody to have a look at it.

IMO, it&apos;s unnecessarily limiting to block video after a user has granted access to the camera, as mentioned in the specific use cases in previous comments.

Limitations like this make the in-browser WebRTC experience on iOS the worst of any platform -- is that really what Apple wants?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1437302</commentid>
    <comment_count>4</comment_count>
    <who name="Dag-Inge Aas">daginge</who>
    <bug_when>2018-06-28 01:16:35 -0700</bug_when>
    <thetext>Just chiming in here. This is a blocker for us in switching between front and back camera on Safari iOS. At least with minimal disruption to the user experience.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1437835</commentid>
    <comment_count>5</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2018-06-29 11:01:08 -0700</bug_when>
    <thetext>&gt;  - It&apos;s common practice to show a user their own local video feed with one
&gt; (higher resolution) stream, and publish another (lower resolution) stream to
&gt; other users

Understood that this is not optimal, although in most UIs the local video track usually takes up a smaller part of the screen than the remote video track.
Also, applyConstraints should be the solution for changing the resolution of a given video track.

At this point, it is not feasible to have two tracks with different resolutions. Ideally, this should be feasible using MediaStream cloning and applyConstraints.

Note that general support for multiple video streams might not always be feasible, in particular if the streams are coming from multiple capture devices.

&gt;  - To accommodate receivers with different bandwidth capabilities, a common
&gt; practice is to publish both a high resolution and a low resolution stream

Simulcast might be a better option there.

(In reply to daginge from comment #4)
&gt; Just chiming in here. This is a blocker for us in switching between front
&gt; and back camera on Safari iOS. At least with minimal disruption to the user
&gt; experience.

I would be interested in what disruption you encounter.
The following is expected to work without too much trouble:

navigator.mediaDevices.getUserMedia({video: { facingMode: &quot;user&quot;} }).then((s) =&gt; {
   localVideo.srcObject = s;
   peerConnection.getSenders()[0].replaceTrack(s.getVideoTracks()[0]);
});</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1438234</commentid>
    <comment_count>6</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2018-06-30 17:23:24 -0700</bug_when>
    <thetext>&gt; Understood that this is not optimal, although in most UI,
&gt; the local video track usually takes a smaller part of the
&gt; screen than the remote video track.

It&apos;s not the only rational UI choice, though. I have a layout where all feeds, including the local user&apos;s feed, are the same size, and many users prefer this layout. Also, it doesn&apos;t seem like a very flexible architectural mindset to make that kind of assumption about how a designer wants to lay things out.

&gt; Note that general support for multiple video streams might not
&gt; always be feasible, in particular if the streams are coming from
&gt; multiple capture devices.

I want to point out here that Chrome on Android has zero restrictions/limitations in this regard. You can call gUM multiple times, grab different resolutions, clone streams, etc., and it all works flawlessly. By comparison, iOS is a nightmare to work on for anything beyond the most basic use cases, which also makes the end user experience worse on iOS because of all the compromises necessary. It&apos;s puzzling to me that Apple implemented WebRTC at all if they&apos;re going to hamstring it.

&gt; Simulcast might be a better option there.

Not all clients support simulcast. For example, Chrome doesn&apos;t yet support it for h264, which is of course the required codec if you want interop with iOS devices.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1455743</commentid>
    <comment_count>7</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2018-08-31 10:24:29 -0700</bug_when>
    <thetext>Adding some further clarification from more testing:

1. This issue only occurs when a subsequent gUM() request asks for an already requested media type. For example, if gUM() #1 asks for video, and gUM() #2 also asks for video, gUM() #1&apos;s video stream is affected. However, if gUM() #2 only asks for audio, then gUM() #1&apos;s video stream is NOT affected.

2. Because the mechanism is setting the track&apos;s muted property, data is still sent along a peer connection, although it&apos;s not very useful since the other side only receives muted video.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1455758</commentid>
    <comment_count>8</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2018-08-31 10:39:21 -0700</bug_when>
    <thetext>This issue also occurs for audio tracks.

I now believe the issue can be fully summarized as: if a getUserMedia() call requests a media type that was requested by a previous getUserMedia() call, the previously requested media track&apos;s &apos;muted&apos; property is set to true, and there is no way to programmatically unmute it.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1455766</commentid>
    <comment_count>9</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2018-08-31 10:57:37 -0700</bug_when>
<thetext>This is currently expected behavior.
There seem to be two requests:
- Allow multiple capture using the same capture device with different parameters (resolution, frameRate...).
- Allow capture on two different capture devices at the same time.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1455786</commentid>
    <comment_count>10</comment_count>
    <who name="Chad Phillips">webkit</who>
    <bug_when>2018-08-31 11:35:00 -0700</bug_when>
    <thetext>@youenn, those seem correct, and are supported by every other platform I&apos;ve tried.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1455799</commentid>
    <comment_count>11</comment_count>
    <who name="Radar WebKit Bug Importer">webkit-bug-importer</who>
    <bug_when>2018-08-31 12:11:49 -0700</bug_when>
    <thetext>&lt;rdar://problem/43950488&gt;</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1489203</commentid>
    <comment_count>12</comment_count>
    <who name="Ramya D">ramyaddurairaj</who>
    <bug_when>2018-12-18 21:20:31 -0800</bug_when>
<thetext>Has this been fixed yet? I&apos;m also facing the same issue, @Chad Phillips.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1489270</commentid>
    <comment_count>13</comment_count>
    <who name="Ramya D">ramyaddurairaj</who>
    <bug_when>2018-12-19 03:46:11 -0800</bug_when>
<thetext>Please mark this issue as a major one. Why is it still not assigned?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1489316</commentid>
    <comment_count>14</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2018-12-19 08:27:19 -0800</bug_when>
    <thetext>*** Bug 192849 has been marked as a duplicate of this bug. ***</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1565432</commentid>
    <comment_count>15</comment_count>
    <who name="Shreyas">shreyas</who>
    <bug_when>2019-08-27 11:36:38 -0700</bug_when>
    <thetext>Are there any updates on this? I am working on an implementation where I need two videos in the same window. I tried MediaStream.clone() but it has its own issues. It would be great to have this feature.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1591000</commentid>
    <comment_count>16</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2019-11-18 00:51:32 -0800</bug_when>
    <thetext>(In reply to Shreyas from comment #15)
&gt; Are there any updates on this? I am working on an implementation where I
&gt; need two videos in the same window. I tried MediaStream.clone() but it has
&gt; its own issues. It would be great to have this feature.

What are the issues you encountered?
cloning and applyConstraints should allow you to have two tracks with the same source and with different sizes.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1595112</commentid>
    <comment_count>17</comment_count>
    <who name="Shreyas">shreyas</who>
    <bug_when>2019-12-03 12:12:00 -0800</bug_when>
<thetext>I resolved this with another approach, but here&apos;s what I was facing:

With clone(), when I try to apply constraints to the new video, they also apply to the original video, which did not fit my solution.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1670051</commentid>
    <comment_count>18</comment_count>
    <who name="seba kerckhof">seba.kerckhof</who>
    <bug_when>2020-07-09 02:19:47 -0700</bug_when>
    <thetext>Can we ever expect to see this fixed?
Sure, we can usually modify our code to work around it, but that doesn&apos;t mean it&apos;s not another bug in an already very buggy WebRTC implementation offered by WebKit.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1687542</commentid>
    <comment_count>19</comment_count>
    <who name="Jaya Allamsetty">jaya.allamsetty</who>
    <bug_when>2020-09-10 14:42:51 -0700</bug_when>
<thetext>Are there any updates on this one? Jitsi users are also facing this issue with both audio and video when switching cameras, because of how the device selection UI is designed. We do a second gUM call for previewing the tracks.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1712697</commentid>
    <comment_count>20</comment_count>
    <who name="Maximilian Böhm">mb</who>
    <bug_when>2020-12-08 00:32:30 -0800</bug_when>
<thetext>So just to be sure: it is currently not supported to &quot;Allow capture on two different capture devices at the same time&quot;, right?

I suppose it is not planned to change this behaviour, as this report has been marked &quot;NEW&quot; since 2017?

We would have a valid use case: one person should be seen while presenting physical objects. Think of a car show: you should see the person trying to sell as well as the car&apos;s interior/exterior.

This is a highly requested feature, especially during COVID...</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1774003</commentid>
    <comment_count>21</comment_count>
    <who name="Olivier Anguenot">oanguenot</who>
    <bug_when>2021-06-30 23:14:42 -0700</bug_when>
<thetext>Had the same problem with audio-only tracks.

In my use case, I create several PeerConnections to connect to several audio rooms in FreeSwitch using Janus as a WebRTC Gateway.

I faced the same issue: my first line was muted as soon as the second was created.

On top of that, the complexity was that the WebRTC part is mainly managed by Janus and the SIP plugin... but fortunately there is a way to access the PeerConnection object from Janus.

The solution I found is a workaround; fixing that issue will definitely help handle the case correctly.

1/ I added a handler for the &quot;onmute&quot; event on the local track associated with each connected room.
2/ When one of my local tracks becomes &quot;muted&quot;, I look through the other existing local tracks for one that is not muted.
3/ I clone that track.
4/ I use the PeerConnection -&gt; Sender -&gt; replaceTrack() function.
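The steps above can be sketched roughly as follows. This is a hypothetical illustration of the workaround, not code from Janus or the SIP plugin; the names (handleMute, localTracks, sendersByTrack) are mine:

```javascript
// Sketch of the workaround: when a local track is muted by a later
// getUserMedia() call, clone a still-unmuted sibling track and swap it
// into the muted track's sender via replaceTrack().
function handleMute(mutedTrack, localTracks, sendersByTrack) {
  // Step 2: find another local track that is not muted.
  const donor = localTracks.find(function (t) {
    return t !== mutedTrack ? !t.muted : false;
  });
  if (!donor) {
    return null; // every local track is muted; nothing to clone
  }
  // Step 3: clone the healthy track.
  const clone = donor.clone();
  // Step 4: swap the clone into the sender that held the muted track.
  const sender = sendersByTrack.get(mutedTrack);
  if (sender) {
    sender.replaceTrack(clone);
  }
  return clone;
}
```

Step 1 corresponds to registering this as the onmute handler, e.g. track.onmute = function () { handleMute(track, localTracks, sendersByTrack); };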

&quot;Magically&quot;, it seems to work. I&apos;m able to speak and be heard in the room associated with that track.

But I&apos;m not sure what the consequences of that are (for listeners, for example...)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1784081</commentid>
    <comment_count>22</comment_count>
    <who name="Trace">relbeatsdev</who>
    <bug_when>2021-08-13 04:58:33 -0700</bug_when>
    <thetext>13-08-2021: Facing the same issue with Jitsi. Hope it will be fixed soon!</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1787094</commentid>
    <comment_count>23</comment_count>
    <who name="awe.media">webkit</who>
    <bug_when>2021-08-24 23:26:41 -0700</bug_when>
    <thetext>BUMP!

Adding streams from multiple cameras (e.g. &apos;environment&apos; and &apos;user&apos; facingMode) would be very useful and works fine in Chrome on Android.

This bug is nearly 4 years old now and still marked as NEW 8(</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1798601</commentid>
    <comment_count>24</comment_count>
    <who name="Austin">abarnes</who>
    <bug_when>2021-09-29 15:16:56 -0700</bug_when>
    <thetext>Currently seeing this still in my company&apos;s application. Any updates on getting a fix?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1839068</commentid>
    <comment_count>25</comment_count>
    <who name="">sak126p</who>
    <bug_when>2022-02-09 02:44:49 -0800</bug_when>
<thetext>Any news? The problem still occurs in Safari 15.3.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1840940</commentid>
    <comment_count>26</comment_count>
    <who name="Brent Fulgham">bfulgham</who>
    <bug_when>2022-02-12 21:48:30 -0800</bug_when>
    <thetext>This is actually:
&lt;rdar://42467754&gt;</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1841945</commentid>
    <comment_count>27</comment_count>
    <who name="Mitch Talmadge">mitch</who>
    <bug_when>2022-02-15 14:23:31 -0800</bug_when>
    <thetext>Adding my voice to this. This is really frustrating for our iOS users in our meeting software. Please dedicate time to get this fixed.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1842309</commentid>
    <comment_count>28</comment_count>
    <who name="Mitch Talmadge">mitch</who>
    <bug_when>2022-02-16 08:59:38 -0800</bug_when>
    <thetext>I&apos;ve also found that using the SpeechRecognition API (https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition) will cause the microphone track to become muted everywhere else, including in WebRTC calls. 

Due to this bug, it is impossible to generate captions using this API in our meeting software without making the user go silent for everyone else. We had to disable caption generation for this reason.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1855460</commentid>
    <comment_count>29</comment_count>
    <who name="Eugene M. Joseph">joseph.eugene</who>
    <bug_when>2022-03-27 16:15:57 -0700</bug_when>
    <thetext>Seconding everything here. 

Unable to locally capture an audio WebRTC stream via MediaRecorder on iOS: the original WebRTC stream stops working when the MediaRecorder instance is started.

It would be very useful to be able to work with simultaneous media streams.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1855593</commentid>
    <comment_count>30</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2022-03-28 07:14:54 -0700</bug_when>
<thetext>Bug 237359 fixes the case of muting a video track when another video track is created on iOS, in https://commits.webkit.org/r291034.

The new behavior on iOS is that, if a new camera device is used, we will stop the previous capture and start the new one, as the current API we are using does not allow capturing from both at the same time. If the new capture is using the same camera as the previous one, both will be able to continue in parallel.

If your particular use case is about capturing with both cameras, please file a new bug.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1855595</commentid>
    <comment_count>31</comment_count>
    <who name="youenn fablet">youennf</who>
    <bug_when>2022-03-28 07:16:27 -0700</bug_when>
    <thetext>(In reply to Austin from comment #24)
&gt; Currently seeing this still in my company&apos;s application. Any updates on
&gt; getting a fix?

The current workaround until the above fix rolls out is to clone the track instead of calling getUserMedia again.
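A minimal sketch of that cloning workaround, assuming track is the video MediaStreamTrack from the first getUserMedia() call (makePreviewTrack and its parameters are illustrative names, not an API):

```javascript
// Clone the existing track instead of calling getUserMedia() again,
// so the original capture keeps running on the same device.
function makePreviewTrack(track, width, height) {
  const preview = track.clone();
  // applyConstraints() targets only the clone; the original track
  // keeps its own resolution.
  preview.applyConstraints({ width: { ideal: width }, height: { ideal: height } });
  return preview;
}
```

For example, makePreviewTrack(stream.getVideoTracks()[0], 320, 240) would yield a low-resolution clone to attach to a preview element while the original full-size track stays in the peer connection.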
This allows getting the same video feed in two tracks, and potentially applying width/height/frame rate constraints independently.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1855933</commentid>
    <comment_count>32</comment_count>
    <who name="awe.media">webkit</who>
    <bug_when>2022-03-29 00:26:08 -0700</bug_when>
    <thetext>(In reply to youenn fablet from comment #30)
&gt; Bug 237359 fixes the case of muting a video track when another video track
&gt; is created on iOS in https://commits.webkit.org/r291034.
&gt; 
&gt; The new behavior in iOS is that, if a new camera device is used, we will
&gt; stop the previous capture and start the new one as the current API we are
&gt; using does not allow to capture both at the same time. If the new capture is
&gt; using the same camera as the previous one, both will be able to continue in
&gt; parallel.
&gt; 
&gt; If your particular use case is about capturing with both cameras, please
&gt; file a new bug.

In case this wasn&apos;t automatically linked back - see https://bugs.webkit.org/show_bug.cgi?id=238492

Thanks.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1910919</commentid>
    <comment_count>33</comment_count>
    <who name="Sean">selawre</who>
    <bug_when>2022-11-07 16:20:47 -0800</bug_when>
    <thetext>&gt; If the new capture is using the same camera as the previous one, both will be able to continue in parallel.

I still get a black screen (after 3 secs) on the second call to getUserMedia() on iOS in this example where the same device is returned:

&lt;!DOCTYPE html&gt;
&lt;html&gt;
  &lt;body&gt;
    &lt;div&gt;
      &lt;video id=&quot;video1&quot; autoplay playsinline&gt;&lt;/video&gt;
    &lt;/div&gt;
    &lt;script type=&quot;text/javascript&quot;&gt;
      
      (async function () { 
        var constraints = {
          audio: false,
          video: true,
        };
        const result1 = await navigator.mediaDevices.getUserMedia(constraints).then(async function(stream) {
          var video1 = document.getElementById(&apos;video1&apos;);
          video1.srcObject = stream;
          return stream;
        });
        console.log(result1.getTracks()[0].getCapabilities().deviceId);

        window.setTimeout(async function () {
          const result2 = await navigator.mediaDevices.getUserMedia(constraints);
          console.log(result2.getTracks()[0].getCapabilities().deviceId);
        }, 3000);
      }());
    &lt;/script&gt;
  &lt;/body&gt;
&lt;/html&gt;


Is this expected?</thetext>
  </long_desc>
      
          <attachment
              isobsolete="0"
              ispatch="0"
              isprivate="0"
          >
            <attachid>326267</attachid>
            <date>2017-11-07 14:58:03 -0800</date>
            <delta_ts>2017-11-07 14:58:03 -0800</delta_ts>
            <desc>Multiple getUserMedia() streams controlled by UI.</desc>
            <filename>ui-multiple-streams.html</filename>
            <type>text/html</type>
            <size>3487</size>
            <attacher name="Chad Phillips">webkit</attacher>
            
              <data encoding="base64">PCFET0NUWVBFIGh0bWw+CjxodG1sPgogIDxoZWFkPgogICAgPG1ldGEgY2hhcnNldD0iVVRGLTgi
PgogICAgPHRpdGxlPk11bHRpcGxlIHN0cmVhbXM8L3RpdGxlPgogIDwvaGVhZD4KICA8Ym9keT4K
CiAgICA8ZGl2PgogICAgICA8YnV0dG9uIGlkPSJ2aWRlbzEtc3RhcnQtc3RvcCI+VmlkZW8gMTwv
YnV0dG9uPgogICAgPC9kaXY+CiAgICA8ZGl2PgogICAgICA8dmlkZW8gaWQ9InZpZGVvMSIgYXV0
b3BsYXkgcGxheXNpbmxpbmU+PC92aWRlbz4KICAgIDwvZGl2PgogICAgPGRpdj4KICAgICAgPGJ1
dHRvbiBpZD0idmlkZW8yLXN0YXJ0LXN0b3AiPlZpZGVvIDI8L2J1dHRvbj4KICAgIDwvZGl2Pgog
ICAgPGRpdj4KICAgICAgPHZpZGVvIGlkPSJ2aWRlbzIiIGF1dG9wbGF5IHBsYXlzaW5saW5lPjwv
dmlkZW8+CiAgICA8L2Rpdj4KCiAgICA8c2NyaXB0IHR5cGU9InRleHQvamF2YXNjcmlwdCI+CgoK
ICAgICAgdmFyIHZpZGVvMV9zdGFydGVkID0gZmFsc2U7CiAgICAgIHZhciB2aWRlbzEgPSBkb2N1
bWVudC5nZXRFbGVtZW50QnlJZCgndmlkZW8xJyk7CiAgICAgIHZhciB2aWRlbzFfYnV0dG9uID0g
ZG9jdW1lbnQuZ2V0RWxlbWVudEJ5SWQoJ3ZpZGVvMS1zdGFydC1zdG9wJyk7CiAgICAgIHZhciBz
dGFydFN0b3BWaWRlbzEgPSBmdW5jdGlvbihlKSB7CiAgICAgICAgaWYgKHZpZGVvMV9zdGFydGVk
KSB7CiAgICAgICAgICB2aWRlbzEuc3JjT2JqZWN0ICYmIHZpZGVvMS5zcmNPYmplY3QuZ2V0VHJh
Y2tzKCkuZm9yRWFjaChmdW5jdGlvbih0cmFjaykgewogICAgICAgICAgICB0cmFjay5zdG9wKCk7
CiAgICAgICAgICB9KTsKICAgICAgICAgIHZpZGVvMS5zcmNPYmplY3QgPSBudWxsOwogICAgICAg
ICAgdmlkZW8xX3N0YXJ0ZWQgPSBmYWxzZTsKICAgICAgICAgIGNvbnNvbGUubG9nKCJWaWRlbyAx
IHN0b3BwZWQiKTsKICAgICAgICB9CiAgICAgICAgZWxzZSB7CiAgICAgICAgICB2YXIgY29uc3Ry
YWludHMxID0gewogICAgICAgICAgICBhdWRpbzogZmFsc2UsCiAgICAgICAgICAgIHZpZGVvOiB7
CiAgICAgICAgICAgICAgaGVpZ2h0OiB7CiAgICAgICAgICAgICAgICBtYXg6IDQ4MCwKICAgICAg
ICAgICAgICB9LAogICAgICAgICAgICAgIHdpZHRoOiB7CiAgICAgICAgICAgICAgICBtYXg6IDY0
MCwKICAgICAgICAgICAgICB9LAogICAgICAgICAgICAgIGRldmljZUlkOiBkZXZpY2VfaWQsCiAg
ICAgICAgICAgIH0sCiAgICAgICAgICB9OwogICAgICAgICAgbmF2aWdhdG9yLm1lZGlhRGV2aWNl
cy5nZXRVc2VyTWVkaWEoY29uc3RyYWludHMxKS50aGVuKGZ1bmN0aW9uKHN0cmVhbSkgewogICAg
ICAgICAgICB2aWRlbzEuc3JjT2JqZWN0ID0gc3RyZWFtOwogICAgICAgICAgICB2aWRlbzFfc3Rh
cnRlZCA9IHRydWU7CiAgICAgICAgICAgIGNvbnNvbGUubG9nKCJWaWRlbyAxIHN0YXJ0ZWQiKTsK
ICAgICAgICAgIH0pLmNhdGNoKGZ1bmN0aW9uKGVycikgewogICAgICAgICAgICBjb25zb2xlLmVy
cm9yKCJEZXZpY2UgYWNjZXNzIGNoZWNrcyBmYWlsZWQ6ICIsIGVyciwgY29uc3RyYWludHMxKTsK
ICAgICAgICAgIH0pOwogICAgICAgIH0KICAgICAgfQoKICAgICAgdmFyIHZpZGVvMl9zdGFydGVk
ID0gZmFsc2U7CiAgICAgIHZhciB2aWRlbzIgPSBkb2N1bWVudC5nZXRFbGVtZW50QnlJZCgndmlk
ZW8yJyk7CiAgICAgIHZhciB2aWRlbzJfYnV0dG9uID0gZG9jdW1lbnQuZ2V0RWxlbWVudEJ5SWQo
J3ZpZGVvMi1zdGFydC1zdG9wJyk7CiAgICAgIHZhciBzdGFydFN0b3BWaWRlbzIgPSBmdW5jdGlv
bihlKSB7CiAgICAgICAgaWYgKHZpZGVvMl9zdGFydGVkKSB7CiAgICAgICAgICB2aWRlbzIuc3Jj
T2JqZWN0ICYmIHZpZGVvMi5zcmNPYmplY3QuZ2V0VHJhY2tzKCkuZm9yRWFjaChmdW5jdGlvbih0
cmFjaykgewogICAgICAgICAgICB0cmFjay5zdG9wKCk7CiAgICAgICAgICB9KTsKICAgICAgICAg
IHZpZGVvMi5zcmNPYmplY3QgPSBudWxsOwogICAgICAgICAgdmlkZW8yX3N0YXJ0ZWQgPSBmYWxz
ZTsKICAgICAgICAgIGNvbnNvbGUubG9nKCJWaWRlbyAyIHN0b3BwZWQiKTsKICAgICAgICB9CiAg
ICAgICAgZWxzZSB7CiAgICAgICAgICB2YXIgY29uc3RyYWludHMyID0gewogICAgICAgICAgICBh
dWRpbzogZmFsc2UsCiAgICAgICAgICAgIHZpZGVvOiB7CiAgICAgICAgICAgICAgaGVpZ2h0OiB7
CiAgICAgICAgICAgICAgICAvL21heDogMjQwLAogICAgICAgICAgICAgICAgbWF4OiA0ODAsCiAg
ICAgICAgICAgICAgfSwKICAgICAgICAgICAgICB3aWR0aDogewogICAgICAgICAgICAgICAgLy9t
YXg6IDMyMCwKICAgICAgICAgICAgICAgIG1heDogNjQwLAogICAgICAgICAgICAgIH0sCiAgICAg
ICAgICAgICAgZGV2aWNlSWQ6IGRldmljZV9pZCwKICAgICAgICAgICAgfSwKICAgICAgICAgIH07
CiAgICAgICAgICBuYXZpZ2F0b3IubWVkaWFEZXZpY2VzLmdldFVzZXJNZWRpYShjb25zdHJhaW50
czIpLnRoZW4oZnVuY3Rpb24oc3RyZWFtKSB7CiAgICAgICAgICAgIHZpZGVvMi5zcmNPYmplY3Qg
PSBzdHJlYW07CiAgICAgICAgICAgIHZpZGVvMl9zdGFydGVkID0gdHJ1ZTsKICAgICAgICAgICAg
Y29uc29sZS5sb2coIlZpZGVvIDIgc3RhcnRlZCIpOwogICAgICAgICAgfSkuY2F0Y2goZnVuY3Rp
b24oZXJyKSB7CiAgICAgICAgICAgIGNvbnNvbGUuZXJyb3IoIkRldmljZSBhY2Nlc3MgY2hlY2tz
IGZhaWxlZDogIiwgZXJyLCBjb25zdHJhaW50czIpOwogICAgICAgICAgfSk7CiAgICAgICAgfQog
ICAgICB9CgogICAgICB2YXIgZGV2aWNlX2lkID0gbnVsbDsKICAgICAgZnVuY3Rpb24gaGFuZGxl
RXJyb3IoZXJyb3IpIHsKICAgICAgICBjb25zb2xlLmxvZygnZGV2aWNlIGVudW1lcmF0aW9uIGVy
cm9yOiAnLCBlcnJvcik7CiAgICAgIH0KICAgICAgdmFyIHNldHVwVmlkZW8gPSBmdW5jdGlvbihk
ZXZpY2VzKSB7CiAgICAgICAgZm9yICh2YXIgaSA9IDA7IGkgIT09IGRldmljZXMubGVuZ3RoOyAr
K2kpIHsKICAgICAgICAgIHZhciBkZXZpY2UgPSBkZXZpY2VzW2ldOwogICAgICAgICAgaWYgKGRl
dmljZS5raW5kID09PSAndmlkZW9pbnB1dCcpIHsKICAgICAgICAgICAgZGV2aWNlX2lkID0gZGV2
aWNlLmRldmljZUlkOwogICAgICAgICAgICBjb25zb2xlLmxvZygiVXNpbmcgdmlkZW8gZGV2aWNl
OiAiICsgZGV2aWNlLmxhYmVsICsgIigiICsgZGV2aWNlX2lkICsgIikiKTsKICAgICAgICAgICAg
dmlkZW8xX2J1dHRvbi5hZGRFdmVudExpc3RlbmVyKCJjbGljayIsIHN0YXJ0U3RvcFZpZGVvMSk7
CiAgICAgICAgICAgIHZpZGVvMl9idXR0b24uYWRkRXZlbnRMaXN0ZW5lcigiY2xpY2siLCBzdGFy
dFN0b3BWaWRlbzIpOwogICAgICAgICAgICBicmVhazsKICAgICAgICAgIH0KICAgICAgICB9CiAg
ICAgIH0KICAgICAgbmF2aWdhdG9yLm1lZGlhRGV2aWNlcy5lbnVtZXJhdGVEZXZpY2VzKCkudGhl
bihzZXR1cFZpZGVvKS5jYXRjaChoYW5kbGVFcnJvcik7CgogICAgPC9zY3JpcHQ+CiAgPC9ib2R5
Pgo8L2h0bWw+Cg==
</data>

          </attachment>
      

    </bug>

</bugzilla>