DVI converts directly to HDMI as long as you have DVI-D on that output (any modern card should; check this wiki article and scroll down to "Connector").
A quick summary of DVI so you don't have to read the whole article: DVI *can* carry both analog and digital signals (that's the difference between DVI-A, DVI-D, and DVI-I connectors), so when it first debuted it was backwards compatible with what we commonly refer to as "VGA" in terms of signalling.
HDMI is digital only, so it requires DVI-D support on the connector/card. Converting from DVI-D to HDMI is a simple passive pass-through: it doesn't need an active converter chip the way VGA-to-DVI-D does, or the way native DisplayPort-to-anything-else does (DisplayPort's packet-based protocol isn't compatible with HDMI/DVI signalling or with any analog output, though Dual-Mode/DP++ ports work around this by emitting HDMI/DVI signalling themselves through a passive adapter).
HDMI has continued to evolve well past the DVI specifications, as has DisplayPort. So if you choose to go from DVI directly to an HDMI monitor, make sure the refresh rate & resolution you want are within what that output can drive. 1080p@60 Hz will never be a problem; 1440p@60 Hz usually works too, but it needs a pixel clock beyond the single-link DVI limit, so it depends on the port being able to clock higher (dual-link DVI, or HDMI 1.3+ signalling through the adapter).
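If you want to sanity-check a mode yourself, a rough calculation is enough: multiply the active pixels per frame by the refresh rate and a blanking-overhead factor, then compare against the link's maximum pixel clock. Here's a minimal Python sketch; the ~10% overhead and the 165/340 MHz limits are commonly quoted ballpark figures, not exact spec values:

```python
# Rough mode feasibility check: pixel clock ~= width * height * refresh * blanking overhead.
# The ~1.10x overhead assumes reduced-blanking timings (CRT-style timings are closer to 1.2-1.3x).
SINGLE_LINK_DVI_MHZ = 165   # single-link DVI / early HDMI max TMDS clock
HDMI_1_3_PLUS_MHZ   = 340   # HDMI 1.3/1.4 max TMDS clock

def pixel_clock_mhz(width, height, refresh_hz, overhead=1.10):
    return width * height * refresh_hz * overhead / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    print(f"{w}x{h}@{hz} Hz -> ~{clk:.0f} MHz "
          f"(single-link DVI: {'OK' if clk <= SINGLE_LINK_DVI_MHZ else 'too high'}, "
          f"HDMI 1.3+: {'OK' if clk <= HDMI_1_3_PLUS_MHZ else 'too high'})")
```

1080p@60 lands around 135-150 MHz depending on timings, comfortably inside single-link DVI; 1440p@60 needs roughly 240 MHz, which is why it hinges on the port being able to clock above the single-link limit.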
--------------
tl;dr info:
VGA typically topped out around 2048x1536 or 2304x1440 in the CRT era (the maximum I typically saw, at least), and the Matrox DualHead2Go family of products will accept up to 3840x1200 over VGA, then split it out across two or three monitors. The refresh rate drops slightly from 60 Hz to 57 Hz, but it shows that VGA has a fairly high maximum bandwidth, more than single-link DVI. The primary limitation is the DAC (digital-to-analog converter), which basically scans the "output buffer" on the digital side and converts it to the analog signal. DACs typically ran at 300 MHz and under, though high-end cards & monitors supported 2048x1536 @ 85 Hz (roughly a 388 MHz pixel clock).

With DVI's initial implementation, single-link topped out around WUXGA (1,920 × 1,200) @ 60 Hz, and dual-link was limited mainly by copper bandwidth, DVI source limitations, and DVI sink limitations.
Dual-link DVI enabled higher resolutions and refresh rates (2560x1600 @ 60 Hz is the limit for DVI-D in most cases), but it still comes in well under current HDMI specifications, which continue to be extended. In the future, USB-C & DisplayPort will probably supplant everything, until we're either carrying the signals directly over the network (via Ethernet or Wi-Fi) or over optical interconnects (keeping in mind that Thunderbolt was originally planned to be optical and had copper interconnects added to the specification for cost reasons and compatibility with USB... which can now be run optically as well).
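To put rough numbers on that gap, you can turn each link's maximum pixel clock into the highest refresh rate it can drive at a given resolution. The limits below are the commonly quoted figures (165 MHz single-link DVI, 330 MHz dual-link, 340 MHz HDMI 1.3/1.4, 600 MHz HDMI 2.0), and the blanking overhead is again a rough approximation:

```python
# Approximate max refresh rate each link can drive at a given resolution,
# assuming reduced-blanking timings (~10% overhead). Ballpark figures, not spec-exact.
LINKS_MHZ = {
    "single-link DVI": 165,
    "dual-link DVI":   330,
    "HDMI 1.3/1.4":    340,
    "HDMI 2.0":        600,
}

def max_refresh_hz(width, height, link_mhz, overhead=1.10):
    return link_mhz * 1e6 / (width * height * overhead)

for res in [(1920, 1200), (2560, 1600), (3840, 2160)]:
    row = ", ".join(f"{name}: ~{max_refresh_hz(*res, mhz):.0f} Hz"
                    for name, mhz in LINKS_MHZ.items())
    print(f"{res[0]}x{res[1]} -> {row}")
```

The output lines up with the limits above: WUXGA@60 just fits single-link, 2560x1600 needs dual-link to reach ~60-70 Hz, and 4K@60 only becomes realistic at HDMI 2.0-class clocks.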
--------------
To answer your question directly: this is dependent on your BIOS/UEFI settings, and the GPU built into the CPU will likely need to be enabled there at boot for it to be detected. Do you have any video encoding software? (HandBrake is free if not.) Use it to check whether Quick Sync is available; if it is, you should also be able to access the built-in GPU.
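If you'd rather check from the OS than through an encoder, you can simply list the display adapters the system actually sees; if the Intel iGPU shows up alongside the discrete card, it's enabled. This is a Linux-only sketch that shells out to lspci (on Windows, Device Manager under "Display adapters" tells you the same thing):

```python
# Linux-only sketch: list the display adapters the kernel can see via lspci.
# If the iGPU is disabled in BIOS/UEFI, it simply won't appear here.
import subprocess

def list_gpus():
    out = subprocess.run(["lspci"], capture_output=True, text=True, check=True).stdout
    # lspci tags GPUs as "VGA compatible controller", "3D controller", or "Display controller"
    return [line for line in out.splitlines()
            if any(tag in line for tag in ("VGA compatible controller",
                                           "3D controller",
                                           "Display controller"))]

for gpu in list_gpus():
    print(gpu)
```

If only the discrete card shows up, look for a firmware setting along the lines of "iGPU Multi-Monitor" or "Integrated Graphics: Enabled" and reboot.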
However, as outlined above, going from DVI to HDMI is a relatively simple affair, and you should be able to do it with a small dongle as long as you're working within the bounds of the DVI spec.