Draw Offscreen Rendering Buffer to Another Window / Canvas


Is it possible to draw the nativeImage obtained through the offscreen rendering API to another window or alternatively a canvas in another window? Ideally this should be done 60 times a second …
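The straightforward approach I have in mind would look something like this (a rough, untested sketch; the `'frame'` channel name and `viewer.html` are made up, and the throttle helper just caps how often frames get forwarded):

```javascript
// Rough sketch, untested: forward offscreen frames from a hidden source
// window to a second window over IPC. Assumes an Electron main process.

// Pure helper: accept a frame only if at least intervalMs has passed since
// the last accepted one (caps how much pixel data goes over IPC).
function makeThrottle(intervalMs) {
  let last = -Infinity;
  return (now) => {
    if (now - last >= intervalMs) {
      last = now;
      return true;
    }
    return false;
  };
}

if (process.versions.electron) {
  const { app, BrowserWindow } = require('electron');
  app.whenReady().then(() => {
    const source = new BrowserWindow({
      show: false,
      webPreferences: { offscreen: true },
    });
    const target = new BrowserWindow({ width: 800, height: 600 });
    const accept = makeThrottle(1000 / 60); // aim for 60 fps

    source.webContents.on('paint', (event, dirty, image) => {
      if (!accept(Date.now())) return;
      // toBitmap() copies the raw pixel data; this copy plus the IPC
      // transfer is the expensive part at high frame rates.
      target.webContents.send('frame', {
        size: image.getSize(),
        pixels: image.toBitmap(),
      });
    });
    source.webContents.setFrameRate(60);
    source.loadURL('https://example.com');
    target.loadFile('viewer.html'); // hypothetical receiver page
  });
}
```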


Since you can call getNativeWindowHandle, I think you could initialize your own graphics device and render into a region of the window.
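For what it's worth, the handle comes back as a Buffer; a native addon would reinterpret those bytes as the platform handle. A rough, untested sketch:

```javascript
// Untested sketch: print the native window handle so a native process or
// addon could pick it up. Assumes a 64-bit little-endian platform.
function handleToHex(buf) {
  return '0x' + buf.readBigUInt64LE(0).toString(16);
}

if (process.versions.electron) {
  const { app, BrowserWindow } = require('electron');
  app.whenReady().then(() => {
    const win = new BrowserWindow();
    // getNativeWindowHandle() returns a Buffer wrapping the platform handle
    // (HWND on Windows, NSView* on macOS, an X11 Window on Linux).
    console.log('native handle:', handleToHex(win.getNativeWindowHandle()));
  });
}
```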


But you would need to handle each platform separately. I haven’t had time to test this, but I’m really curious whether anyone has experience with it.


Are you trying to render video into an Electron application? I tried using getNativeWindowHandle to let a GStreamer image sink draw onto the canvas, without any luck. There was some discussion, and it seemed to be prevented by Chromium being hardware accelerated, which meant an outside process couldn’t draw on it.

@ilgiallomondadori had some luck drawing frames manually in this thread. Something like WebChimera might work, but it has a performance penalty due to having to do an extra memcpy to get the video into the necessary Electron process.


I could never get it to work, but I would love it if someone did and showed us all how. I didn’t have control over the rendering, just passing a window handle, but still, I’d love some insight into this.


I’m also still very curious. It seems like the right solution involves hooking into/before/around/some-other-word Chromium’s OpenGL draws to allow a native application to draw in the necessary region.

Maybe a crazier option is to create a frameless, transparent window. Then you could launch another video player directly underneath the Electron app. This really feels like a huge hack, though.
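An untested sketch of that hack (the option names are real BrowserWindow options; `overlay.html` is made up):

```javascript
// Untested sketch: a frameless, transparent, click-through Electron window
// that an external video player could be positioned underneath.

// Pure helper: the BrowserWindow options the hack relies on.
function overlayOptions(width, height) {
  return {
    width,
    height,
    frame: false,      // frameless
    transparent: true, // see through to the player below
    alwaysOnTop: true, // keep the Electron UI above the player
    hasShadow: false,  // avoid a shadow outline around the transparent area
  };
}

if (process.versions.electron) {
  const { app, BrowserWindow } = require('electron');
  app.whenReady().then(() => {
    const overlay = new BrowserWindow(overlayOptions(1280, 720));
    overlay.setIgnoreMouseEvents(true); // let clicks fall through to the player
    overlay.loadFile('overlay.html');   // hypothetical UI page
  });
}
```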


Interesting discussion … For those wondering what I’m aiming to do: I am creating an animation in a hidden, frameless window with Angular and would like to capture the RGBA data, then output the separate channels in another window. I have previously simply repeated the actual app in multiple windows, but that unfortunately cannot guarantee that the animation happens at the same time in all windows.


I have tried capturing an image both via win.webContents.on('paint', (event, dirty, image) => { … }) and via win.webContents.beginFrameSubscription(false, (frameBuffer, dirtyRect) => { … }). While this works, the problem is getting the data into another window in a timely manner. Sending it over IPC and rendering it to a fullscreen canvas tops out at about 15 fps, while the original content happily sits at 60 fps. Even using the dirtyRect information to send only the changed region doesn’t change this.
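For reference, the receiving side of my attempt looks roughly like this (hypothetical viewer script; the 'frame' channel name and canvas id are made up, and as far as I can tell the raw bitmaps are BGRA on little-endian platforms while canvas ImageData expects RGBA, so the channels need swapping):

```javascript
// Hypothetical renderer-side script for the receiving window; assumes
// nodeIntegration is enabled so require('electron') works in the page.

// Pure helper: copy a BGRA buffer into RGBA order (swap B and R channels).
function bgraToRgba(buf) {
  const out = Uint8ClampedArray.from(buf);
  for (let i = 0; i + 3 < out.length; i += 4) {
    const b = out[i];
    out[i] = out[i + 2]; // R
    out[i + 2] = b;      // B
  }
  return out;
}

if (typeof document !== 'undefined') {
  const { ipcRenderer } = require('electron');
  const ctx = document.getElementById('view').getContext('2d'); // <canvas id="view">
  ipcRenderer.on('frame', (event, { size, pixels }) => {
    const rgba = bgraToRgba(pixels);
    ctx.putImageData(new ImageData(rgba, size.width, size.height), 0, 0);
  });
}
```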