Why does Python stop calculating more decimal places after 14 digits?


I am trying to implement Newton's algorithm to compute the square root of a number, and while it works, I can only get accuracy up to 14 places after the decimal point. I tried to use getcontext().prec but it didn't work. Here's the code:

number = float(input("number less than 1000 : "))
li = []
# collect integers whose square does not exceed the number (sqrt(1000) < 32)
for i in range(32):
    if i*i <= number:
        li.append(i)

# the largest such integer is the starting guess for Newton's method
firstroot = li[-1]
li.clear()

wants = int(input("how many digits: "))

count = 0

# Newton's iteration: x_next = (x + number/x) / 2
while count < wants:
    root = 0.5*(firstroot + (number/firstroot))
    firstroot = root
    count += 1

answer = str(firstroot)
print(answer[0:wants+1])

Even if I enter 50 at the "wants" prompt, I only get up to 14 decimal places. How do I solve this?

CodePudding user response:

That's because firstroot is simply a float, i.e. an IEEE 754 double-precision number, which carries only about 15-17 significant decimal digits. If you need more digits, you must either implement the arithmetic yourself or use a package like mpmath which has already done that for you.
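For illustration, a minimal sketch with mpmath (assuming the package is installed, e.g. via pip install mpmath):

from mpmath import mp

mp.dps = 50          # working precision: 50 decimal places
print(mp.sqrt(2))    # prints sqrt(2) to 50 significant digits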

In your special case, since you have already hand-implemented the square root from basic arithmetic, you could alternatively use the standard-library decimal module and crank up its precision.
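For instance, a quick sanity check using decimal's built-in Decimal.sqrt(), which respects the context precision and lets you verify a hand-rolled iteration:

from decimal import Decimal, getcontext

getcontext().prec = 50       # 50 significant digits
print(Decimal(2).sqrt())     # square root at context precision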

CodePudding user response:

You can use the decimal module for more precision in your variables (instead of the default float type).

import decimal

## Your code:
number = float(input("number less than 1000 : "))
li =[]
for i in range(32):

    if i*i <= number:
        li.append(i)

firstroot = (li[-1])
li.clear()

wants = int(input("how many digits: "))

##
decimal.getcontext().prec = wants   # precision counts significant digits
count = 0
# go through str() so the float's binary rounding error is not carried over
number = decimal.Decimal(str(number))
firstroot = decimal.Decimal(firstroot)

half = decimal.Decimal("0.5")
## Your code:
while count < wants:
    root = half*(firstroot + (number/firstroot))
    firstroot = root
    count += 1

print(firstroot)
# or a string:
str_firstroot = str(firstroot)
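
As a further refinement, you can parse the input straight into a Decimal so no float (and its 15-17 digit limit) ever enters the computation:

number = decimal.Decimal(input("number less than 1000 : "))  # no float round-trip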