Define 1..

Pretender

Banned
Mar 14, 2000
7,192
0
0
While the voices in my head were telling me to come here and nef, one voice in particular gave me something to think about. What is 1?

Think about it: all the numbers in all the sets of numbers that we define (integers, rationals, etc.) can be defined in terms of 1. 2 is 1+1. 3 is 1+1+1, or 2+1. 4 is 1+1+1+1, or 3+1, etc. But what is 1? It's the number that we use to define all the other numbers we know, yet we cannot define it. How do we know it even exists, how can we define it, how can we prove it? Does anyone else find it strange that mathematics, a language defined by its logic and its thorough use of proofs based on factual information, is all based on a number whose existence we haven't even proven?


Sigh, this is the stuff that the voices in my head tell me when I'm bored.
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
I think 1 is sometimes called unity, the basic building block of numbers.

By the way, 0 cannot be defined in terms of 1 :)
 

Pretender

Banned
Mar 14, 2000
7,192
0
0
0 = 1-1.

I'd like to see you define 1 without using 1, and without using numbers defined by 1. It's impossible.
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
0 is the absence of a value,

your way of 'defining' 0 = 1 - 1 is incorrect in my opinion.

how do you define root(2) without using root(2) in the definition?
 

hendon

Senior member
Oct 9, 2000
373
0
0
1 is the number which, when divided by itself, gives itself...
i.e.
a/a = a
has the only solution 1.
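That rule can be sanity-checked numerically. A quick sketch (Python, purely illustrative; the variable names are mine):

```python
# hendon's rule: 1 is the unique nonzero x satisfying x/x == x.
# Scan a grid of candidates from -5.0 to 5.0 in steps of 0.1.
candidates = [x / 10 for x in range(-50, 51) if x != 0]
solutions = [x for x in candidates if x / x == x]
print(solutions)  # [1.0]
```

Since x/x is 1 for every nonzero x, the condition x/x == x forces x itself to be 1, which is exactly what the grid search finds.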
 

Entity

Lifer
Oct 11, 1999
10,090
0
0


<< I'd like to see you define 1 without using 1, and without using numbers defined by 1. It's impossible. >>



Pretender,

No more impossible than you insisting that 1 is the root of everything.

1 = 2 - 2/2

There: one as defined by the number two, without using 1 as a building block.

In the same example, you could say that 1 = .5+.5, or you could say that .5= 1 / 2; essentially, it's all a matter of perspective. :)

Rob
 

Pretender

Banned
Mar 14, 2000
7,192
0
0
Hmm, good point xtreme....I guess the only way to define root(2) would be:
root(2) * root(2) = 2
but then to get root(2) by itself, it'd become
root(2) = 2/root(2), and that falls into the category of defining something using itself.

And, it seems, hendon has solved my little question. :)
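As an aside, the circular relation root(2) = 2/root(2) isn't computationally useless: averaging x with 2/x at each step is the Babylonian (Heron's) method, and it converges to root(2). A small sketch (Python, just for illustration):

```python
# x = 2/x is circular as a definition, but averaging the two sides
# at each step gives the Babylonian method, which converges to sqrt(2).
def babylonian_sqrt2(steps=20):
    x = 1.0                      # any positive starting guess works
    for _ in range(steps):
        x = (x + 2.0 / x) / 2.0  # average x with 2/x
    return x

print(babylonian_sqrt2())  # ~1.41421356...
```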
 

Pretender

Banned
Mar 14, 2000
7,192
0
0


<< 1 = 2 - 2/2

There: one as defined by the number two, without using 1 as a building block.
>>


But how would you define 2? By 6/3? Then how would you define 6 and 3? In terms of higher numbers (i.e. 72/12 and 12/4)? The only difference here is that you'd have a problem defining the largest number instead of the smallest. Unfortunately I know little about the definition of infinity and dealing with that stuff yet, so my argument falls to pieces here. It's been fun arguing, though :)
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
1 = 2 - 2/2

but 2 = 1+1

whatever you make 2 into, say 6/3 or 8/4, 6 is 1+1+1+1+1+1 so it still cannot be defined without 1.
 

Hanpan

Diamond Member
Aug 17, 2000
4,812
0
0
One is the only singular numerical entity. That is, it is the only natural number that consists only of itself. For a harder problem, try to define zero or x/0.
 

Entity

Lifer
Oct 11, 1999
10,090
0
0


<< 1 = 2 - 2/2

but 2 = 1+1

whatever you make 2 into, say 6/3 or 8/4, 6 is 1+1+1+1+1+1 so it still cannot be defined without 1.
>>



The problem with that is this: 1 is only defined in terms of smaller numbers, according to that. Meaning: you can't define 2 without 1; yet you can't define 1 without .01, if that makes sense. .01+.01 (you get the point) will eventually = 1. Thus, 1 is a component of smaller numbers, ad infinitum. Either way, 1 is not the root of itself, unless you only consider natural numbers - consequently limiting your focus.

Rob
 

gnognugs

Banned
Feb 17, 2001
810
1
0
This is the first calculation I ever learned.

5318008

Just about as vile as it getz around here:Q
 

Pretender

Banned
Mar 14, 2000
7,192
0
0
You can define smaller numbers in terms of 1.

.1 = 1/(1+1+1+1+1+1+1+1+1+1) (1/10)
Just one over many ones.

Thus you could define .1 in terms of 1, and consequently define 1 in terms of .1, which in turn was defined in terms of 1. Yet the only way to define .1 would be either in terms of larger numbers (1000/10000), or the "smallest" (by smallest, I mean magnitude - hence -5 is greater than 1) number using 1's, like I did above.
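Both directions of that arithmetic check out; a tiny sketch (Python, nothing assumed beyond the arithmetic above):

```python
# .1 written using only 1s: one over ten ones...
ten_ones = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1
tenth = 1 / ten_ones
print(tenth)       # 0.1
# ...and 1 recovered back from .1.
print(tenth * 10)  # 1.0
```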
 

Thom

Platinum Member
Oct 18, 1999
2,364
0
0
Surely 1 is defined as "on" just as 0 is defined as "off".

Unity as well. Trust me, there will be an official mathematical definition somewhere. Mathematicians are that sad. :)
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
multiplicative identity

the generator of all the positive integers by addition

unity

a unit in the reals, integers, complex numbers

the largest digit in binary

i^4
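Several of those characterizations can be spot-checked numerically; a quick sketch (Python, purely illustrative):

```python
# Spot-checking some characterizations of 1 from the list above.
assert 42 * 1 == 42                  # multiplicative identity
assert sum([1] * 7) == 7             # 1 generates the positive integers by addition
assert max('1011', key=int) == '1'   # the largest digit in binary
assert abs((1j) ** 4 - 1) < 1e-12    # i^4 = 1, up to floating-point rounding
print("all checks pass")
```

Note that `(1j) ** 4` is computed in floating point, so it lands a hair off exactly 1; hence the tolerance in the comparison.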
 

Wagner

Banned
Aug 11, 2000
88
0
0
In any logical system or mathematical theory, you need to start out with a set of predefined symbols and a set of predefined axioms.

The simplest version of what we call math, or arithmetic, defines the zero symbol as well as the increment symbol, along with some other symbols. So we have:

0 == 0
0+ == 1
0++ == 2
0+++ == 3

and so on and so forth.

Our notion of one, or unity, is just a way of simplifying or representing the act of counting, which is essential to the practice of arithmetic. In the above symbology, no reference to one is made, but you can clearly see how the act of counting has been symbolized. (Through the use of the increment operator.)

So 1 is nothing. It's just a symbol that serves to symbolize some idea that can be represented in many other ways.
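That increment symbology is easy to play with in code. Here's a toy sketch (Python; the names `succ` and `to_int` are mine, not from any particular formal system):

```python
# Toy Peano-style numerals: zero is a primitive symbol, and every other
# natural number is just a chain of increments applied to it.
ZERO = ()

def succ(n):
    """The increment operator: 0+ is succ(ZERO), 0++ is succ(succ(ZERO)), etc."""
    return (n,)

def to_int(n):
    """Count the increments, purely for display; the numerals never mention 1."""
    count = 0
    while n != ZERO:
        n, count = n[0], count + 1
    return count

one, two, three = succ(ZERO), succ(succ(ZERO)), succ(succ(succ(ZERO)))
print(to_int(one), to_int(two), to_int(three))  # 1 2 3
```

Inside the numerals themselves there is no "1" anywhere, only the act of incrementing, which is exactly the point being made above.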
 

Mountain

Senior member
Dec 7, 2000
326
0
0
the amount your self is. another is a different self, hence another "one".
 

Fingers

Platinum Member
Sep 4, 2000
2,188
0
0


<< 1 is the number which when divided by itself, gives itself... >>



so what you're saying is if I have 1000000, which is 1 million, then that = 1 because 1000000/1000000 = 1

your definition covers an infinite amount of numbers.