
Factorisation in CPUs

Harabecw

Senior member
Hello!
I've read (on qubit.com, heh) about how multiplication works a lot faster than the opposite - factorisation.

I was interested in knowing how modern CPUs do factorisation, as I've had an idea about a way to make it faster.
However, I don't know how well CPUs do this, so I wanted to ask first 😉
 
Factorisation of a product of two primes can only be done via brute force currently. There might be some other way to do it, but the only people who would know are the NSA. I suspect general factorisation is just an extension of the two-prime case to n primes.
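To make "brute force" concrete, here's a minimal trial-division sketch in C (my own illustration, not anything a real attack would use): it finds the smallest factor by testing odd candidates up to sqrt(n), which is exactly why it becomes hopeless for the large semiprimes used in cryptography.

```c
#include <stdint.h>

/* Trial division: return the smallest prime factor of n (n > 1).
   For a semiprime n = p*q the loop runs on the order of sqrt(n)
   times, so doubling the bit length of n squares the work. */
uint64_t smallest_factor(uint64_t n) {
    if (n % 2 == 0) return 2;
    for (uint64_t d = 3; d * d <= n; d += 2) {
        if (n % d == 0) return d;
    }
    return n; /* no divisor found: n itself is prime */
}
```

A 2048-bit RSA modulus would need around 2^1024 iterations of that loop, which is why "brute force" is a statement of impossibility rather than a method.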
 
Not only does factorization take a long time, but division also takes a lot longer than multiplication. Basically, division is done as a series of multiplications that converge on the answer.
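One common multiplication-based scheme is Newton-Raphson reciprocal iteration (a sketch of the general idea, not necessarily what any particular CPU implements; it assumes the divisor has already been normalised into [1, 2), the way hardware does by stripping off the binary exponent):

```c
/* Newton-Raphson iteration for 1/d:  x_{k+1} = x_k * (2 - d * x_k).
   Each step costs two multiplies and a subtract, and the number of
   correct bits roughly doubles per iteration - this is the sense in
   which division "converges" via multiplications. */
double recip_newton(double d, int iters) {
    double x = 1.0;           /* crude seed, valid for d in [1, 2);
                                 real hardware seeds from a lookup table */
    for (int i = 0; i < iters; i++)
        x = x * (2.0 - d * x);
    return x;
}
```

With that seed, about six iterations already reach full double precision; a quotient is then just dividend times reciprocal.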
 
Originally posted by: Harabecw
is there a program I can use to test different methods of factorisation and how well they perform?

i wonder why you're so interested in the timing.
i don't think it will be significant, especially with CPU speeds > 2GHz.
 
Factoring numbers quickly is very useful. If he does in fact have a better way to do it, he should be encouraged 🙂. Even with a 10GHz CPU, factoring products of primes faster is good. Besides, if you could factor fast enough, you could steal all the money in the world 😉.

I'd say write a C program that implements various algorithms, and just time it.
 
:Q :Q :Q
I'm ashamed to admit I'm not too familiar with C 🙂
Are you? I've got my idea written out exactly if you're up for 5 minutes of coding hehe.
 
Well, if it IS a good idea, then I would be leery of telling anyone until you had a decent patent on it. On the other hand, it's pretty hard to find out if it's a good idea or not until you have implemented it. I guess you're between a rock and a hard place. The safe thing to do would be to learn C; the easy thing would be to just post it and to hell with the fame and glory. I'm fairly sure that anything you might have discovered would already have been discovered by either the large public academic crypto community or by one of the many secret service organisations around the world, but you never know.
 
By factorization I assume you mean division?

The computer does it internally the same way you would by hand, except in binary instead of decimal. Being in binary actually makes it a bit simpler: each digit of the result is determined by a comparison. If the divisor is less than or equal to the current partial remainder, the digit is a 1, otherwise a 0, and the subtraction that produces the partial remainder for the next digit happens only when the digit was a 1.
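That bit-by-bit procedure can be sketched in C like so (a restoring-division illustration of the idea, not any particular CPU's circuit):

```c
#include <stdint.h>

/* Binary long division, one result bit per step: bring down the next
   bit of the dividend, compare the partial remainder against the
   divisor, emit a 1 and subtract if it fits, otherwise emit a 0. */
uint32_t divide_bits(uint32_t num, uint32_t den, uint32_t *rem_out) {
    uint32_t q = 0, rem = 0;
    for (int i = 31; i >= 0; i--) {
        rem = (rem << 1) | ((num >> i) & 1); /* bring down next bit */
        q <<= 1;
        if (rem >= den) {     /* the comparison that decides the digit */
            rem -= den;       /* subtract only when the digit is a 1 */
            q |= 1;
        }
    }
    if (rem_out) *rem_out = rem;
    return q;
}
```

One compare/subtract per quotient bit is why a 32-bit divide costs dozens of cycles while a multiply costs only a few.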
 
Originally posted by: Harabecw
:Q :Q :Q
I'm ashamed to admit I'm not too familiar with C 🙂
Are you? I've got my idea written out exactly if you're up for 5 minutes of coding hehe.

Anything worth writing takes more than 5 minutes 😉. If you know any programming language at all, you could still probably write it.
 
Originally posted by: CTho9305
Originally posted by: Harabecw
:Q :Q :Q
I'm ashamed to admit I'm not too familiar with C 🙂
Are you? I've got my idea written out exactly if you're up for 5 minutes of coding hehe.
Anything worth writing takes more than 5 minutes 😉. If you know any programming language at all, you could still probably write it.
Well, it might take a few years to execute if you did it in VB, but I digress... 😉

<-- is reminded of the time he tried to implement an image blending (with alpha channels) algorithm in VB; damn that was slow. :disgust:
 
For normal division, I found this great book called "Hacker's Delight" that could help you.

In terms of primes, check out The Prime Page. I got a factorizing program off there that uses about six different methods.
 
Actually, to see the division algorithm, look at a 6502 assembly guide.

Those old processors did not have multiply and divide instructions, so it was done in software. The algorithm is the same, I'm sure, and it's a common textbook example since it has such obvious practical use.
 