04-20-2018, 02:29 PM   #4
JB_Harris
Junior Member

Join Date: Apr 2018
Posts: 1

CPU acting as GPU

I think this will always be a problem when you're rendering video in software instead of on GPU hardware. I'm not an expert, but I'm fairly certain that all of the frame rendering, window management and decoration, shadows, transparency, positioning, etc. on these DisplayLink devices has to be done by the host PC's CPU and then transferred as raw frames/bitmaps to the monitor. I don't think anything can be offloaded to the screen itself, since the interface just doesn't understand it.
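To make that concrete, here's a minimal sketch (in C) of the kind of work the host ends up doing: render into a plain memory buffer, diff it against the last frame, and push only the changed scanlines out over USB. This is not the actual DisplayLink protocol or driver code; usb_send(), the tile size, and the resolution are made-up stand-ins just to show where the CPU cycles go.

[CODE]
/* Sketch of a CPU-side display pipeline like the post describes: the host
 * renders every frame into ordinary RAM, diffs it against the previous
 * frame, and pushes only the changed pixels over USB. NOT the real
 * DisplayLink protocol; usb_send() is a hypothetical stand-in. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define W 1920
#define H 1080
#define TILE 16  /* diff granularity, in scanlines */

static uint32_t prev[W * H];  /* last frame already on the monitor */
static uint32_t cur[W * H];   /* frame just rendered by the CPU */

/* Hypothetical transport: in a real driver this would be a USB bulk
 * transfer of (probably compressed) pixel data; here we just count bytes. */
static size_t usb_send(const void *buf, size_t len)
{
    (void)buf;
    return len;
}

/* Send only the bands of scanlines that changed since the last frame.
 * All of this work -- the compare, the copy, the transfer -- burns host
 * CPU time, which is exactly the trade-off in question. */
static size_t push_frame(void)
{
    size_t sent = 0;
    for (int y = 0; y < H; y += TILE) {
        int rows = (y + TILE <= H) ? TILE : H - y;
        size_t off = (size_t)y * W;
        size_t bytes = (size_t)rows * W * sizeof(uint32_t);
        if (memcmp(&prev[off], &cur[off], bytes) != 0) {
            sent += usb_send(&cur[off], bytes);
            memcpy(&prev[off], &cur[off], bytes);
        }
    }
    return sent;
}

int main(void)
{
    /* "Render" something in software: fill one band of the screen. */
    for (int i = 100 * W; i < 200 * W; i++)
        cur[i] = 0x00FF00u;
    printf("bytes pushed over USB: %zu\n", push_frame());
    return 0;
}
[/CODE]

A GPU-attached monitor never sees any of this, because composition happens on the graphics card; here every step runs on the host.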

It reminds me of the old "WinModems" from back in the late 90s. They were modems where all of the modulation and signal processing was done by the CPU instead of by an onboard microprocessor. That made them very cheap, but it meant they weren't compatible with other operating systems, and it meant you needed a relatively fast CPU to handle the higher speeds (like 56k).

Trade-offs.
 

