alphatarget1
Diamond Member
This might really be a n00b question, but nevertheless... I'm trying to figure this out.
Let's say I have an array of data:
1 100 90
2 50 90
3 10 90
4 0.5 90
5 0.125 90
6 0.0001 90
7 0.0000001 90
Basically, I'm trying to program something that uses the SLOPE function on the 2nd column and has the program select the range that the line best fits. The slope converges to 0...
I set up a cell that does the slope calculation, and I can tell it to iterate until the slope matches a certain precision. What I don't know is how to tell the program to look only at the rows where the 3rd column equals a certain value, and how to get VBA to automatically move down a row when the precision is not met...
Any ideas?
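One way to sketch this in VBA: scan column C for the target value, then repeatedly compute the slope of column B against column A over a range that shrinks from the top, moving the start row down until the slope meets the tolerance. This is a minimal sketch, assuming the data sits in columns A:C of "Sheet1" with no header row; the sheet name, target value (90), and tolerance are assumptions you would adjust.

```vba
Sub FindConvergedRange()
    Dim ws As Worksheet
    Dim firstRow As Long, lastRow As Long, r As Long
    Dim target As Double, tol As Double, s As Double

    Set ws = ThisWorkbook.Worksheets("Sheet1") ' assumed sheet name
    target = 90                                ' assumed value to match in column C
    tol = 0.001                                ' assumed slope precision
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    ' Find the first row where column C equals the target value
    For r = 1 To lastRow
        If ws.Cells(r, "C").Value = target Then
            firstRow = r
            Exit For
        End If
    Next r
    If firstRow = 0 Then Exit Sub ' target value not found

    ' Move the start of the range down one row at a time until the
    ' slope of column B vs column A meets the precision.
    ' SLOPE needs at least 2 points, hence lastRow - 1.
    For r = firstRow To lastRow - 1
        If ws.Cells(r, "C").Value <> target Then Exit For
        s = Application.WorksheetFunction.Slope( _
                ws.Range(ws.Cells(r, "B"), ws.Cells(lastRow, "B")), _
                ws.Range(ws.Cells(r, "A"), ws.Cells(lastRow, "A")))
        If Abs(s) < tol Then
            ' Select the best-fit range and stop
            ws.Range(ws.Cells(r, "A"), ws.Cells(lastRow, "C")).Select
            Exit Sub
        End If
    Next r
End Sub
```

With the sample data above, each pass drops the largest remaining value from the top of the range, so the computed slope shrinks toward 0 until it falls under the tolerance.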