- Sep 17, 2002
- 14,582
- 162
- 106
I feel like a colossal idiot. It has been a very long time since I was in a math class doing simple algebra, and I am drawing a complete blank on how to approach this problem. I developed an equation that produces the correct answer every time:
[ B/(1-A) ] + [ D/(1-C) ]
The problem is that this does not fit into my algorithm very well (for reasons that would take too long to explain). I came up with a separate equation that produces almost the same value each time (the discrepancy always seems to show up after the 3rd or 4th decimal place). The new equation is this:
[ 1/(1-AxB) ] x [ 1/(1-CxD) ]
While the numerical examples always come really, really close, I wanted to prove that the two expressions were the same mathematically before I use the second equation in my algorithm. This is where I forget how to proceed. Where do I start? At first I set one expression equal to 0 and tried to solve for the other, but I think I am making a huge blunder there; I don't believe I can do that. If anyone can point me in the right direction, it would be greatly appreciated.
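For what it's worth, here is the kind of quick numeric comparison I have been doing, written out as a small Python sketch (the function names are just for illustration, not part of my actual algorithm; the sample values are arbitrary, chosen to stay away from the division-by-zero points at A=1, C=1, AxB=1, and CxD=1):

```python
def expr1(a, b, c, d):
    # B/(1-A) + D/(1-C)
    return b / (1 - a) + d / (1 - c)

def expr2(a, b, c, d):
    # [1/(1-AxB)] x [1/(1-CxD)]
    return 1 / (1 - a * b) * 1 / (1 - c * d)

# Compare the two expressions on a few arbitrary sample points.
for vals in [(0.1, 0.2, 0.3, 0.4), (0.5, 0.5, 0.5, 0.5)]:
    print(vals, expr1(*vals), expr2(*vals), expr1(*vals) - expr2(*vals))
```

Printing the difference alongside both values makes it easy to see whether the gap shrinks to rounding error or stays fixed for a given set of inputs.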
I feel sooo stupid right now!