


PC monitor (Analog D-15 connector) as a TV

Status
Not open for further replies.

NJ Roadmap

New Member
What kind of circuit would I be looking at? I'm sure these things are available on the market, but I'm just wondering how it's done. I would obviously need a TV tuner of some sort (Freeview or Sky). What I'm interested in is the device that converts the analog video into RGB signals for input to the monitor. How is it done?

The other issue is that if I were using Sky+, which has an HDMI output... there probably isn't any way to convert HDMI signals into analog RGB, or is there?

The reason I'm asking is that I'm replacing my 19" Acer widescreen (1440x900) with a 24" Dell widescreen soon, so I'm thinking of using the 19" widescreen in the kitchen as a TV. (I don't want a dual-monitor setup; the 24" is big enough on its own, and it has a 1920x1200 resolution!)
 
TV and monitors use completely different scan frequencies. To do it you need to first decode the composite signal, digitise it into memory, then read it out at a different rate - oh, and also scale it up to match the screen resolution. It's far from a trivial project. You can buy units that do it, but the picture quality is usually pretty poor anyway.
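To make the "digitise into memory, read out at a different rate, scale up" step concrete, here is a minimal software sketch of what a scan converter's scaler stage does. The resolutions are illustrative (576 active PAL lines up to a 1440x900 panel), and real scalers do this in dedicated silicon with polyphase filtering, not nearest-neighbour picks in Python:

```python
# Hypothetical sketch of the frame-buffer scaling described above.
PAL_LINES, PAL_COLS = 576, 720      # digitised active PAL video (illustrative)
MON_LINES, MON_COLS = 900, 1440     # 19" panel native resolution

def scale_frame(frame):
    """Nearest-neighbour upscale: for each output pixel, pick the
    closest source pixel. Real scaler chips use proper filtering."""
    out = []
    for y in range(MON_LINES):
        src_row = frame[y * PAL_LINES // MON_LINES]
        out.append([src_row[x * PAL_COLS // MON_COLS]
                    for x in range(MON_COLS)])
    return out

# A dummy "decoded frame": each pixel tagged with its (row, col).
frame = [[(y, x) for x in range(PAL_COLS)] for y in range(PAL_LINES)]
scaled = scale_frame(frame)
print(len(scaled), len(scaled[0]))  # 900 1440
```

The key point is that the output side is read at the monitor's own pixel clock and line rate, completely decoupled from the TV timing on the input side - which is exactly why a frame buffer is unavoidable.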

A far better idea is to sell the monitor, and buy a TV - the picture will be FAR better, and will probably be cheaper as well.
 
Early CGA and EGA resolutions weren't much higher than TV signals. Is it really that hard? I'm guessing it would be limited more by the max/min horizontal/vertical refresh rates the monitor supports. Perhaps simple frequency-doubling circuits would work for TV-to-VGA resolution - a sort of poor man's de-interlacing? If you can decode the TV signal into horizontal, vertical, R, G and B signals, there's a decent chance, depending on your monitor, that you'd be able to get it to work. It would just have to be modern enough to have 'fine tuning' controls.
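The "poor man's" approach being hinted at here is a line doubler: each scanline of a field is output twice, which doubles the effective horizontal rate (roughly 15.6 kHz up to roughly 31.3 kHz) and lands in normal VGA territory. A minimal sketch, with illustrative line counts for one PAL field:

```python
def line_double(field):
    """Write every scanline out twice - roughly what a simple
    'bob' line doubler does in hardware."""
    out = []
    for line in field:
        out.append(line)
        out.append(line)   # same line again on the next scan
    return out

field = [f"line{n}" for n in range(288)]   # one PAL field, 288 lines
frame = line_double(field)
print(len(frame))  # 576
```

Even this simple scheme still needs a line's worth of buffering in hardware (the line has to be stored so it can be replayed at twice the rate), so it isn't quite a pure analog frequency-doubling circuit.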
 
I suggest you try looking into how TV signals work - you can't just 'double' a video signal!

Normal VGA is about double TV standard frequencies, and it's always been EXTREMELY rare for even multi-sync monitors to go anywhere near that low. A good place to look is Amiga sites, because the later Amigas added VGA-type outputs as well as normal TV ones - so it was essential to have a monitor that coped with both. Even back then they were as rare as hen's teeth!
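For reference, the "about double" claim checks out numerically: PAL's horizontal scan rate is 15.625 kHz, while standard 640x480@60 VGA scans at 31.469 kHz.

```python
# Horizontal scan rates behind "VGA is about double TV frequencies".
pal_h_khz = 15.625   # PAL horizontal rate
vga_h_khz = 31.469   # standard 640x480@60 VGA horizontal rate

print(round(vga_h_khz / pal_h_khz, 2))  # 2.01
```

So a monitor has to sync all the way down to roughly half its normal VGA line rate to display a raw TV signal - which is exactly why so few of them could.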
 
