DDC
DVI borrowed the I2C protocol for its DDC lines, and HDMI then carried it over. This was a shortsighted choice of protocol, because nobody foresaw that DVI and HDMI would be used far beyond the simple case of one DVD player connected to one TV. Although both standards relaxed the maximum DDC capacitance to 700 pF (taking advantage of the relatively low data rate), that is still far too limited for a pro AV system. The typical DDC capacitance of a 7.5 m (25') cable already exceeds 700 pF, and each electronic input or output adds 50 pF or more. It gets worse when HDMI is run over Cat5 cables for longer transmission distances: a typical Cat5e cable has an equivalent DDC capacitance of about 50 pF per meter, so a 100 m (330') run contributes 5000 pF! In short, the majority of systems used in the pro AV industry do not meet the I2C DDC capacitance specs.
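To make the numbers concrete, here is a minimal capacitance-budget sketch in C. It uses only the nominal figures quoted above (50 pF per meter of Cat5e, 50 pF per device port); the cable length and port count are hypothetical, not taken from any particular installation.

/* Rough DDC capacitance budget check - a sketch only.
 * Per-metre and per-port figures are the nominal values quoted
 * in the text, not measured data. */
#include <stdio.h>

#define DDC_LIMIT_PF        700.0   /* relaxed DVI/HDMI capacitance limit */
#define CAT5E_PF_PER_METRE   50.0   /* typical Cat5e figure               */
#define DEVICE_PORT_PF       50.0   /* per electronic input/output        */

int main(void)
{
    double cable_m = 100.0;   /* hypothetical HDMI-over-Cat5 run      */
    int    ports   = 2;       /* one source output + one sink input   */

    double total_pf = cable_m * CAT5E_PF_PER_METRE + ports * DEVICE_PORT_PF;

    printf("Estimated DDC bus capacitance: %.0f pF (limit %.0f pF)\n",
           total_pf, DDC_LIMIT_PF);
    printf("%s\n", total_pf > DDC_LIMIT_PF ? "Out of spec" : "Within spec");
    return 0;
}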
When the DDC capacitance exceeds the spec, the open-drain I2C lines charge too slowly through their pull-up resistors, so the bus timing goes wrong: devices can no longer reliably tell when the line is busy or free, or whether the receiver actually acknowledged the data. This causes all kinds of data corruption, and the devices involved may not even know it.
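To see why, consider the I2C rise time, which the I2C specification measures between 30% and 70% of VDD and limits to 1000 ns in Standard mode. The sketch below assumes a nominal 1.5 kOhm pull-up (an assumption for illustration, not a value mandated by DVI or HDMI) and shows how a 5000 pF bus blows far past that limit.

/* Estimate the open-drain rise time of the DDC (I2C) lines for a given
 * bus capacitance - a sketch only.  The 1.5 kOhm pull-up is an assumed
 * nominal value, not something specified by the DVI/HDMI standards. */
#include <stdio.h>
#include <math.h>

#define PULLUP_OHMS        1500.0    /* assumed pull-up resistor          */
#define I2C_STD_TR_MAX_NS  1000.0    /* Standard-mode rise-time limit     */

/* I2C rise time, 30% to 70% of VDD on an RC charge:
 * t_r = R * C * ln(0.7 / 0.3), roughly 0.85 * R * C */
static double rise_time_ns(double cap_pf)
{
    return PULLUP_OHMS * cap_pf * 1e-12 * log(0.7 / 0.3) * 1e9;
}

int main(void)
{
    /* I2C spec max, relaxed HDMI limit, and a 100 m Cat5e run */
    double caps_pf[] = { 400.0, 700.0, 5000.0 };

    for (int i = 0; i < 3; i++)
        printf("C = %6.0f pF -> t_r ~= %6.0f ns (limit %.0f ns)\n",
               caps_pf[i], rise_time_ns(caps_pf[i]), I2C_STD_TR_MAX_NS);
    return 0;
}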
Generating an HDMI or DVI signal from a general microcontroller
In theory there is nothing stopping you from generating a video signal for an HDMI or DVI monitor relatively simply by manually bit-bashing the four TMDS pairs (CLK±, D0±, D1± and D2±), and it would be rather a cool thing to do for many embedded applications. However, the catch is the speed requirement of the signal. Whilst you don't need to operate at the full speed the interfaces can run at, both are specified with a minimum clock speed of 25 MHz. That is just the clock channel; the three data pairs need to run at 10x this speed, as there are 10 data bits per clock cycle, so you would need a microcontroller which could provide a steady 250 Mbit/s stream on each data pair of the HDMI or DVI interface.
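As a back-of-the-envelope check, the sketch below multiplies the pixel clock by the 10 bits per TMDS character. The 25.175 MHz figure for 640x480 @ 60 Hz is the standard VESA timing, included here only as an example mode.

/* TMDS bit-rate arithmetic - a sketch only. */
#include <stdio.h>

#define TMDS_BITS_PER_CLOCK 10       /* each data pair carries 10 bits per pixel clock */
#define MIN_PIXEL_CLOCK_HZ  25.0e6   /* minimum clock the interfaces are specified for */

int main(void)
{
    double pixel_clock_hz = 25.175e6;   /* standard 640x480 @ 60 Hz pixel clock */

    printf("Minimum TMDS bit rate per data pair: %.0f Mbit/s\n",
           MIN_PIXEL_CLOCK_HZ * TMDS_BITS_PER_CLOCK / 1e6);
    printf("640x480 @ 60 Hz bit rate per data pair: %.2f Mbit/s\n",
           pixel_clock_hz * TMDS_BITS_PER_CLOCK / 1e6);
    return 0;
}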