After all these years Excel STILL does bad math.

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
You need to learn how to format cells, buddy.
That's been standard since the DOS spreadsheets (whatever they called them).

What I'd like to see is some high-end graphing software to replace my graphing calculator. Excel kinda sorta does it already, but I'd like the ability to go into a more calculator-type interface. y = mx + b and such.

matlab, wolfram alpha
 

Ns1

No Lifer
Jun 17, 2001
55,420
1,600
126
Plugged it into my Excel sheet

=1554.93-1154.66
400.27


OP fail
 

AeroEngy

Senior member
Mar 16, 2006
356
0
0
If you have an 80MB Excel book, you are doing it wrong. WAY wrong.

If I found someone in my company trying to use an 80MB Excel book, I'd nix that shit right quick, and I'm not even the IT guy.

I had a professor at GWU give me a 109MB spreadsheet that he used to calculate the probability of system failure given the probabilities of component failures and a system diagram. It was insanity and could have easily been done in MATLAB, or even as a macro in Excel, and only been a couple hundred KB. He was even Dean of the School of Engineering & Applied Sciences for a little while ...
 

Train

Lifer
Jun 22, 2000
13,587
82
91
www.bing.com
I had a professor at GWU give me a 109MB spreadsheet that he used to calculate the probability of system failure given the probabilities of component failures and a system diagram. It was insanity and could have easily been done in MATLAB, or even as a macro in Excel, and only been a couple hundred KB. He was even Dean of the School of Engineering & Applied Sciences for a little while ...

When your only tool is a hammer...
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
I had a professor at GWU give me a 109MB spreadsheet that he used to calculate the probability of system failure given the probabilities of component failures and a system diagram. It was insanity and could have easily been done in MATLAB, or even as a macro in Excel, and only been a couple hundred KB. He was even Dean of the School of Engineering & Applied Sciences for a little while ...

Wow. Was there a lot of data in that thing? I will never try to code anything in Excel.
 

Dulanic

Diamond Member
Oct 27, 2000
9,968
592
136
Excel is extremely versatile and can be invaluable for many parts of a business. Yeah, some people can flex their epeen about how it can be done the x, y, z way. However, that is not reasonable for most non-programmers, whereas Excel can be used by just about anyone.
 

Train

Lifer
Jun 22, 2000
13,587
82
91
www.bing.com
Excel is extremely versatile and can be invaluable for many parts of a business. Yeah, some people can flex their epeen about how it can be done the x, y, z way. However, that is not reasonable for most non-programmers, whereas Excel can be used by just about anyone.

That's not the point

80MB that takes 30 mins to open is just plain dumb. ANY program that takes that long to open a doc, even on a slow system, has far exceeded the intended purpose of the application.

Excel can still be used as a front end (for the everyday user) and open right away, and still have access to much more than 80MB worth of data.
 

mb

Lifer
Jun 27, 2004
10,233
2
71
That's not the point

80MB that takes 30 mins to open is just plain dumb. ANY program that takes that long to open a doc, even on a slow system, has far exceeded the intended purpose of the application.

Excel can still be used as a front end (for the everyday user) and open right away, and still have access to much more than 80MB worth of data.
My work computer is a POS and it opens 100MB files (budgets) no problem. He said part of it was due to their network as well. So don't get all uppity because one person said it is slow.
 

Train

Lifer
Jun 22, 2000
13,587
82
91
www.bing.com
My work computer is a POS and it opens 100MB files (budgets) no problem. He said part of it was due to their network as well. So don't get all uppity because one person said it is slow.

Sorry if you take me as being uppity. I'll add that if someone in my office was using a 100MB Excel file... even if it opened quickly, I would probably remedy the data store situation.
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
Odd because when I put in 1,554.93 - 1,154.66 it gives me the right answer :hmm:
Plugged it into my Excel sheet

=1554.93-1154.66
400.27


OP fail

It's random. That's one equation it happened to me on, but you're right: if I copy that equation to another set of cells it works properly o_O

Maybe it has something to do with the long continuous summing equations.
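For what it's worth, here's a minimal Python sketch of the "long continuous summing" guess (Python uses the same IEEE 754 doubles that Excel works with under the hood): most decimal fractions can't be stored exactly in binary, so tiny representation errors can pile up across a long chain of additions even though each single step looks harmless.

```python
# Sketch only: repeatedly adding a value that has no exact binary
# representation (0.1) drifts away from the exact decimal answer.
total = 0.0
for _ in range(10_000):
    total += 0.1

print(total)            # very close to, but not exactly, 1000.0
print(total == 1000.0)  # False
```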
 

Dulanic

Diamond Member
Oct 27, 2000
9,968
592
136
My work computer is a POS and it opens 100MB files (budgets) no problem. He said part of it was due to their network as well. So don't get all uppity because one person said it is slow.

I'm with you. I use 50MB files with tons of formulas and it might take 30 seconds. I am guessing he has files linked to other files with slow access. Though I don't know that he got uppity.
 
Last edited:

CLite

Golden Member
Dec 6, 2005
1,726
7
76
I'm with you. I use 50MB files with tons of formulas and it might take 30 seconds. I am guessing he has files linked to other files with slow access.

+1. I've worked with 90MB files that had enormous lookup tables; they loaded in tens of seconds and, once loaded, ran blazingly fast. We had someone replace a lot of the table operations with programming to reduce the size to a few MB, and now it runs slower :(.
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
I've noticed huge increases in speed with an SSD when working with large excel spreadsheets.
 

Ninjahedge

Diamond Member
Mar 2, 2005
4,149
1
91
Plugged it into my Excel sheet

=1554.93-1154.66
400.27


OP fail

Ditto.

Most of the time, rounding errors come in when you have formatted cells that hold more precision than they display.

If you see 1554.93, it may actually BE 1554.9299999999999~, and unless you use functions like ROUND you will get this carried over into all the other calcs.

This is not that big a deal until you get a chart that, because of formatting, says something like:

400 - 110 - 56 - 10 = 223

You submit that to a client and they just end up scratching their head.
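A quick Python sketch of the same display-versus-stored-value issue (same IEEE 754 doubles Excel works with): cell formatting only changes what you see, while something like ROUND actually changes the number that flows into later calcs.

```python
from decimal import Decimal

a = 1554.93            # stored as the nearest binary double, not exactly 1554.93
b = 1154.66
diff = a - b

print(Decimal(a))      # the value actually stored: a long decimal near 1554.93
print(diff)            # the raw difference, not necessarily exactly 400.27
print(round(diff, 2))  # 400.27 -- explicit rounding, the same idea as Excel's ROUND()
```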
 

Bignate603

Lifer
Sep 5, 2000
13,897
1
0
If you have an 80MB Excel book, you are doing it wrong. WAY wrong.

If I found someone in my company trying to use an 80MB Excel book, I'd nix that shit right quick, and I'm not even the IT guy.

How else do you suggest I go through hours of data taken with multiple scans per second? I tend not to work with the whole file at once, but I still need to load it into Excel so I can reduce it down.
 

AeroEngy

Senior member
Mar 16, 2006
356
0
0
Wow. Was there a lot of data in that thing? I will never try to code anything in Excel.

You input the conditions into the first sheet, and he built support for doing all the matrix math into a second sheet. The sheet was pre-set up for some huge number of possible components, filling almost an entire sheet with equations. However, when you were modeling, say, 10-15 components, over 99% of the sheet was filled with zeros and was useless. It could have easily been a macro that took the number of components and set up the matrices correctly, without 30,000 rows by 200 columns of equations all computing zero.
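As an aside, here's a rough Python/NumPy sketch of the "size it to the actual component count" idea. The reliability formula is only a placeholder (a simple series system where every component must survive), not the professor's actual model.

```python
import numpy as np

def system_failure_probability(component_failure_probs):
    """Placeholder series-system model: the system fails if any component fails."""
    p = np.asarray(component_failure_probs, dtype=float)  # sized to the real input,
                                                          # not a fixed 30,000 x 200 grid
    p_all_survive = np.prod(1.0 - p)                      # probability every component works
    return 1.0 - p_all_survive

# 3 components -> 3 entries; nothing is allocated for components that don't exist
print(system_failure_probability([0.01, 0.02, 0.005]))    # ~0.0347
```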
 

Ninjahedge

Diamond Member
Mar 2, 2005
4,149
1
91
Text files are not necessarily that big, 603.....

You have to look at what you are actually storing and how many times you are repeating the same info. Sometimes something as simple as consolidating a set of calculations (or using VLOOKUP rather than recalculating EVERY point) will save a lot of space....
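Roughly the same trick sketched in Python, for illustration: build the lookup once and reference it per data point instead of repeating the full calculation in every row (the expensive_calc function here is just a made-up stand-in).

```python
import math

def expensive_calc(x):
    # Stand-in for a formula that would otherwise be copied into every row.
    return math.erf(x / math.sqrt(2.0))

# Built once, like a small VLOOKUP table on a side sheet.
lookup = {x: expensive_calc(x) for x in (0.5, 1.0, 1.5, 2.0)}

rows = [0.5, 1.0, 1.0, 2.0, 0.5, 1.5]  # the actual data points
results = [lookup[x] for x in rows]    # cheap lookups instead of recomputing each row
print(results)
```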


In your defense, I have had similar large sheets calculating the long term deflection of concrete high-rise columns (you need a datapoint for each time step and each construction phase for each portion of the column.....it gets rather large rather fast.)
 

Ninjahedge

Diamond Member
Mar 2, 2005
4,149
1
91
Aero, not all of us know enough VBA to do that!!! ;)

(Just when I started getting good at macro script.....)
 

Train

Lifer
Jun 22, 2000
13,587
82
91
www.bing.com
How else do you suggest I go through hours of data taken with multiple scans per second?

With a tool designed for exactly that, aka Excel's big brother... Access. Or my personal preference, SQL Express, because it runs great on a lappy, is totally free, and I'm more comfortable with all the querying tools. Or full-blown SQL Server or Oracle if you really need something hefty.

Whether you use Access or some other database, you can still use Excel for the front-end work. I've done apps that just dump subsets of data into Excel so the user can play with it. Don't get me wrong, I LOVE Excel; it's unbeatable at desktop slicing and dicing, and I use it every day. But it's not a data repository.

I tend not to work with the whole file at once, but I still need to load it into Excel so I can reduce it down.

The app I wrote at my last place kept about 80 Gigs in SQL Server, but allowed users to define a few filters, and spit out the result to Excel to let the user play with it from there. By default it only returned 500 rows, but would let you override if you really needed more.
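To make that pattern concrete, here's a hedged Python sketch (the server, database, table, and column names are made up for illustration): keep the raw scans in SQL Server or SQL Express, filter on the server, and hand Excel only the small slice the user asked for.

```python
import pyodbc
import pandas as pd

# Connection details are illustrative only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost\\SQLEXPRESS;DATABASE=ScanData;Trusted_Connection=yes;"
)

# Filter on the server so only a few hundred rows ever reach the workbook.
query = """
    SELECT TOP 500 scan_time, channel, reading
    FROM scan_log
    WHERE scan_time BETWEEN ? AND ?
    ORDER BY scan_time
"""
subset = pd.read_sql(query, conn, params=["2010-01-01", "2010-01-02"])

# Dump the subset to a small workbook the user can slice and dice in Excel.
subset.to_excel("scan_subset.xlsx", index=False)
```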