I don't know that this question really makes sense, to be honest, since modeling speaker wire as a transmission line is far more work than needed. Running the numbers, a 3 dB loss (half power) would require a wire run of about 1600 feet, so knowing the dB relationship is pretty pointless.
The R of the speaker = 8 ohms (not strictly valid, since a speaker's impedance changes with frequency, but for simplicity treat it as a perfect 8 ohm resistor).
The R of the wire = 2.525 mΩ per foot according to Wikipedia (I'm lazy, sue me), or about 0.00252 ohms per foot, and you need to double that, since every foot of the run has two conductors carrying the current out and back.
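To make the doubling concrete, here's a tiny sketch (the per-foot figure is just the Wikipedia number above; the function name is mine):

```python
R_PER_FOOT = 0.00252  # ohms per foot of a single conductor (the ~2.525 mOhm figure above)

def wire_resistance(run_feet: float) -> float:
    """Series resistance a speaker run adds: both conductors carry the current."""
    return 2 * R_PER_FOOT * run_feet

print(wire_resistance(1000))  # ~5.04 ohms for a 1000 ft run
```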
Assume an audio receiver that delivers 100 watts to the load regardless of its impedance (in reality I'd figure most are voltage limited, not power limited, but constant power keeps the math simple).
In that case the power reaching the speaker comes down to a simple divider: the wire and the speaker carry the same current, so the power splits in proportion to resistance, and speaker power % = 100 * 8 / (8 + 2 * 0.00252 * feet). So 1000 feet of speaker wire puts 61.3% of the power into the speaker, or in decibels, a loss of about -2.12 dB. Using decibels here really tells you almost nothing IMO, and if you have a 1000 foot run of speaker wire, you might want to rethink your setup.
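If you want to play with the numbers yourself, here's a rough sketch of that divider under the constant-power assumption (names are mine, and the speaker is still the idealized flat 8 ohms):

```python
import math

R_SPEAKER = 8.0       # idealized flat 8 ohm speaker
R_PER_FOOT = 0.00252  # ohms per foot of a single conductor

def speaker_power_fraction(run_feet: float) -> float:
    """Fraction of the (assumed constant) amplifier power that reaches the speaker."""
    r_wire = 2 * R_PER_FOOT * run_feet  # round trip: out and back
    return R_SPEAKER / (R_SPEAKER + r_wire)

frac = speaker_power_fraction(1000)
print(f"{frac:.1%}")                      # ~61.3% of the power makes it to the speaker
print(f"{10 * math.log10(frac):.2f} dB")  # ~ -2.12 dB

# Half-power (-3 dB) check: the wire resistance has to equal the speaker's 8 ohms,
# so feet = 8 / (2 * 0.00252) -- about 1587 ft, i.e. the "about 1600 feet" above.
print(R_SPEAKER / (2 * R_PER_FOOT))
```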