Author Topic: Hausdorff dimension of PI ???!?!  (Read 3373 times)
cKleinhuis (Administrator, Fractal Senior, Posts: 7044)
« on: January 07, 2013, 02:56:51 AM »

hey dudes,

a friend of mine has asked me a question that I couldn't answer directly...

How would you calculate the Hausdorff dimension of a number sequence generated by, e.g., pi? I know it is just a sequence of digits between 0 and 9 (inclusive), but the Hausdorff dimension can be applied to one-dimensional objects as well, so I would be interested in determining the Hausdorff dimension of common numbers, e.g. sqrt(2), e and so on. How would you approach this?

asimes (Fractal Lover, Posts: 212)
« Reply #1 on: January 07, 2013, 06:03:10 AM »

I may be wrong, as you guys here probably know a lot more about Hausdorff dimension than me, but shouldn't it just be one? It seems like it is just an infinite sequence of digits on the positive real line.

The fourth example at this link has an entry for numbers (real numbers with even digits between 0 and 1); maybe that will help: http://en.wikipedia.org/wiki/List_of_fractals_by_Hausdorff_dimension
cKleinhuis (Administrator, Fractal Senior, Posts: 7044)
« Reply #2 on: January 07, 2013, 06:21:34 AM »

it should be one for a constant sequence, e.g. 3.333333333333333333333333333333 with just 3s, sure... but since the digits jump up and down, the Hausdorff dimension should say something about the wildness of such a number...
fractower (Iterator, Posts: 173)
« Reply #3 on: January 07, 2013, 06:32:33 AM »

I think you are on the right track. It might be interesting to replace the space-filling measure with Shannon entropy.

http://en.wikipedia.org/wiki/Information_theory

In computer science entropy is usually measured in bits, but since you are interested in the decimal representation, it is useful to define the entropy in units of digits:

H(N) = -\sum_{x} p(x) \log_{10} p(x)

where the sum runs over all strings x of N digits, H(N) is the entropy of N digits, and p(x) is the probability of the particular string x. In this case the Hausdorff/Shannon dimension d would be defined by

H(N) \sim N^{d}

as N approaches infinity.

For example, if the digits were completely random, then H(N) = N and d = 1.

With this definition 10/3 would have an H/S dimension of d = 0, because additional digits carry zero surprise.
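For anyone who wants to try this numerically: a minimal Python sketch of the block-entropy estimate above, assuming the mpmath library as the digit source (any table of pi digits would do just as well). If the digits behave randomly, the empirical H(N) should sit close to N:

Code:
# Empirical block entropy H(N) of pi's decimal digits, in digit units.
from collections import Counter
from math import log10
from mpmath import mp

mp.dps = 10005                       # working precision, in decimal digits
digits = mp.nstr(mp.pi, 10001)[2:]   # "3.1415..." -> "1415...", ~10^4 digits

def block_entropy(s, N):
    """Entropy of the length-N digit blocks of s, in base-10 units."""
    blocks = [s[i:i + N] for i in range(len(s) - N + 1)]
    total = len(blocks)
    return -sum((c / total) * log10(c / total)
                for c in Counter(blocks).values())

for N in (1, 2, 3):
    print(f"H({N}) = {block_entropy(digits, N):.4f}  (max possible: {N})")

Running the same function on a string of repeated 3s gives H(N) = 0 for every N, matching the d = 0 claim for 10/3. (Estimating larger N honestly needs far more digits, since there are 10^N possible blocks to sample.)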
fractower (Iterator, Posts: 173)
« Reply #4 on: January 07, 2013, 06:40:01 AM »

Oh, I forgot to mention: to the best of my knowledge \pi, e, \sqrt{2} all appear to have a dimension of 1 (completely random digits). However, I do not know whether that has been proved.
cKleinhuis (Administrator, Fractal Senior, Posts: 7044)
« Reply #5 on: January 07, 2013, 06:43:41 AM »

ok, so I could write a program to measure it to a certain extent.

This brings up another question: is the representation relevant? I mean, when pi is written as a bit string, or calculated in another base, e.g. 2, 3, 4, 5, 6 and so on, could that be regarded as a comparable measurement? My first guess would be that this would not change the result... (see the sketch below)
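Something like the following could test that guess. A rough sketch, again assuming mpmath; digits_in_base is a helper written just for this post. The entropy is normalized by log(base), so 1.0 means "as random as possible" in every base:

Code:
# Compare the normalized single-digit entropy of pi across several bases.
from collections import Counter
from math import log
from mpmath import mp

mp.dps = 7000        # enough working precision for 5000 base-16 digits

def digits_in_base(x, base, n):
    """First n digits of the fractional part of x in the given base."""
    frac = x - int(x)
    out = []
    for _ in range(n):
        frac *= base
        d = int(frac)
        out.append(d)
        frac -= d
    return out

for base in (2, 3, 10, 16):
    ds = digits_in_base(mp.pi, base, 5000)
    total = len(ds)
    H = -sum((c / total) * log(c / total, base)
             for c in Counter(ds).values())
    print(f"base {base:2d}: normalized entropy = {H:.4f}")

If pi is normal in every base, each line should print a value very close to 1.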
Tglad (Fractal Molossus, Posts: 703)
« Reply #6 on: January 07, 2013, 06:59:34 AM »

The point at pi (and any other point) has dimension 0. If you mean the sequence of digits, I don't see how they could have a dimension.

If it helps, pi is (almost certainly) a 'normal' number, meaning the sequence of digits behaves like a random distribution: no digit is more likely than any other, regardless of the base used.
fractower (Iterator, Posts: 173)
« Reply #7 on: January 07, 2013, 07:21:00 AM »

Unfortunately, measuring Shannon entropy is probably one of the "hard" problems. There are random-number checkers out there that provide a measure of cryptographic goodness. Another easy method is to write a binary representation of the number to a file and then try compressing it (see the sketch below).

I cannot say for sure, but I suspect all rational bases will give the same dimension: 10/3 in base 10 is 3.333333..., but in base 3 it would be 10.100000...; in both cases d = 0. Irrational bases are a different story, however. For example, \pi in base \pi is just 10. Somehow I think that probably has negative information content.
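The compression test is easy to try. A rough Python sketch, assuming mpmath for the digits and zlib as the compressor; incompressibility is only a crude stand-in for entropy, but the contrast with 10/3 is stark:

Code:
# Compression as a crude randomness test: pi's digits vs. 10/3's digits.
import zlib
from mpmath import mp

mp.dps = 20005
pi_digits = mp.nstr(mp.pi, 20001)[2:].encode("ascii")   # 20000 ASCII digits
periodic = b"3" * len(pi_digits)                        # the digits of 10/3

for name, data in (("pi", pi_digits), ("10/3", periodic)):
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name:>4}: compressed to {ratio:.3f} of original size")

ASCII digits carry at most log2(10) ≈ 3.32 bits per 8-bit byte, so even perfectly random digits should only compress to about 0.42 of their size; the periodic string collapses to almost nothing.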
hobold (Fractal Bachius, Posts: 573)
« Reply #8 on: January 07, 2013, 07:31:05 AM »

Quote from: fractower
Unfortunately measuring Shannon entropy is probably one of the "hard" problems.

It's even harder to measure algorithmic complexity, i.e. information content in the sense of Chaitin: the length of the shortest program that can produce a particular sequence of bits. The number pi is computable by compact algorithms, so it contains far less information than a truly random sequence of bits.

Back to the original question, though. Hausdorff dimension is a measure of a set of points embedded in a surrounding space. For the question of pi's Hausdorff dimension to make sense, you'd have to specify the set of points and the surrounding space. How does one get from pi to a geometrical object?
fractower (Iterator, Posts: 173)
« Reply #9 on: January 07, 2013, 07:45:58 AM »

Quote from: hobold
Chaitin: the shortest program that can produce a particular sequence of bits.

I like that one, but to be fair I think you have to include data storage as well.
hobold (Fractal Bachius, Posts: 573)
« Reply #10 on: January 07, 2013, 08:56:24 AM »

Algorithmic complexity does include all library routines and all lookup tables, i.e. all the direct and indirect input bits required to produce a program's output bits. No cheating allowed.
kram1032 (Fractal Senior, Posts: 1863)
« Reply #11 on: January 07, 2013, 09:17:30 AM »

How did you get from the Hausdorff dimension to Shannon entropy?

I could think of a way to define a pi-set: simply interpret \pi as an image with pixel values between 0 and 9. On that infinitely long image, run some typical approximate box-counting-style algorithm (see the sketch below).

You could try the very same with the binary sequence or, if you fancy it, even base e, which in a way would be "the natural" base in a mathematical sense, or the golden-ratio base (phinary).

Or a different method might be to make a diagram of pi like this:
http://mathworld.wolfram.com/PiContinuedFraction.html
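A quick sketch of the box-counting idea, treating the digit sequence as the point set (i, d_i) scaled into the unit square (mpmath again assumed for the digits; the grid sizes are arbitrary choices for this post):

Code:
# Box-count the "graph" of pi's digits: points (i/L, d_i/10) in the unit square.
from math import log
from mpmath import mp

mp.dps = 10005
digits = [int(c) for c in mp.nstr(mp.pi, 10001)[2:]]
L = len(digits)

def box_count(k):
    """Number of cells of a k x k grid hit by the digit graph."""
    return len({(i * k // L, d * k // 10) for i, d in enumerate(digits)})

ks = [10, 20, 50, 100]
ns = [box_count(k) for k in ks]
for k, n in zip(ks, ns):
    print(f"grid {k:3d} x {k:<3d}: {n} occupied boxes")

# Least-squares slope of log N(k) against log k estimates the dimension.
xs, ys = [log(k) for k in ks], [log(n) for n in ns]
m = len(ks)
slope = (m * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) \
        / (m * sum(x * x for x in xs) - sum(xs) ** 2)
print(f"estimated box-counting dimension ~ {slope:.2f}")

For a normal number every reachable digit row fills in at these scales, so the estimate comes out near 1, consistent with asimes' answer above.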
fractower (Iterator, Posts: 173)
« Reply #12 on: January 07, 2013, 05:23:22 PM »


Quote from: kram1032
How did you get from the Hausdorff dimension to Shannon entropy?

I guess by convention the Hausdorff dimension only applies to spatial contexts, but the rescale-and-renormalize operation can be applied to more abstract metrics. Entropy seemed like a good metric, since the original question related to the decimal representation of common irrational numbers.
kram1032 (Fractal Senior, Posts: 1863)
« Reply #13 on: January 08, 2013, 09:36:02 AM »

I wasn't aware that Shannon entropy can act as a metric...

Let's see. To be a metric, you need to fulfill the following conditions:
  • symmetry in the inputs: d(x,y) = d(y,x)
  • non-negativity: d(x,y) \geq 0 for all x, y
  • identity of indiscernibles: d(x,y) = 0 <=> x = y
  • the triangle inequality (*): d(x,z) \leq d(x,y) + d(y,z)

(*) The triangle inequality simply states that there is no shorter path between two points than a straight line (Euclidean) or, more generally, that the shortest path between two points always lies on a geodesic.

Shannon entropy clearly is symmetric.
It also is non-negative.
However, the other two conditions really only make sense with two arguments.

With only two arguments, I can only get one case of Shannon entropy that equals 0, though:
0 \log 0 + 1 \log 1 = 0 (taking 0 \log 0 = 0 by convention)

The first input here is not equal to the second input, which violates the third condition. Shannon entropy cannot be a metric, so you can't use it for a generalized evaluation of the Hausdorff dimension. However, there are some metric-like concepts in probability theory that might be usable.

The distance correlation correlates distances between parts of a set and, iirc, acts more like a metric. (I don't have the time to read/test that right now; I'm working from memory from back when I discovered that concept.)

Furthermore, distance correlation is another way to measure the information in a system, just like Shannon entropy is.

On an unrelated note: the Mathematica module of this forum apparently doesn't support \Leftrightarrow or \iff, which are both ways to write <=>
fractower (Iterator, Posts: 173)
« Reply #14 on: January 08, 2013, 10:36:06 PM »

Quote from: kram1032
I wasn't aware that Shannon entropy can act as a metric...
Let's see. To be a metric, you need to fulfill: symmetry, non-negativity, d(x,y) = 0 <=> x = y, and the triangle inequality.

Ouch! I guess I need to be more careful about terms. I was using "metric" in the colloquial sense, as in something that can be measured. Though you bring up a good point: it turns out you can construct a metric that meets your definition,

D(X,Y) = H(X,Y) - I(X;Y)

where H(X,Y) is the joint entropy (all the information given X and Y) and I(X;Y) is the mutual information (all the information common to X and Y). Or, stated another way,

D(X,Y) = H(X|Y) + H(Y|X)

i.e. all the information in Y that is not in X, plus all the information in X that is not in Y.
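A small numeric check of that distance, as a Python sketch (the digit strings and the pairing of their characters as (X, Y) samples are just illustrative choices):

Code:
# Variation of information: D(X,Y) = H(X,Y) - I(X;Y) = 2 H(X,Y) - H(X) - H(Y),
# estimated here from paired samples drawn from two digit strings.
from collections import Counter
from math import log10

def entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * log10(c / total) for c in counts.values())

def information_distance(xs, ys):
    Hxy = entropy(Counter(zip(xs, ys)))   # joint entropy H(X,Y)
    Hx, Hy = entropy(Counter(xs)), entropy(Counter(ys))
    # I(X;Y) = H(X) + H(Y) - H(X,Y), hence D = 2 H(X,Y) - H(X) - H(Y).
    return 2 * Hxy - Hx - Hy

s = "3141592653589793238462643383279502884197"
print(information_distance(s, s))         # identical: D = 0 (up to rounding)
print(information_distance(s, s[::-1]))   # shuffled pairing: D > 0

This D is known as the "variation of information"; it is symmetric, non-negative, and satisfies the triangle inequality, with D = 0 exactly when each variable fully determines the other.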