Runge's Phenomenon

 2018-09-13 
This is a post I wrote for round 2 of The Aperiodical's Big Internet Math-Off 2018. As I went out in round 1 of the Big Math-Off, you got to read about the real projective plane instead of this.
Polynomials are very nice functions: they're easy to integrate and differentiate, it's quick to calculate their value at points, and they're generally friendly to deal with. Because of this, it can often be useful to find a polynomial that closely approximates a more complicated function.
Imagine a function defined for \(x\) between -1 and 1. Pick \(n+1\) points that lie on the function. There is a unique polynomial of degree (at most) \(n\) (a polynomial whose highest power of \(x\) is \(x^n\)) that passes through these points. This polynomial is called an interpolating polynomial, and it sounds like it ought to be a pretty good approximation of the function.
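If you'd like to see this in code, here's a minimal Python sketch (using NumPy, with some made-up sample points) that fits the unique polynomial through four points and checks that it really passes through them:

```python
import numpy as np

# Four (x, y) points chosen just for illustration: a unique cubic
# (degree 3 = number of points minus 1) passes through all of them.
xs = np.array([-1.0, -0.3, 0.4, 1.0])
ys = np.array([0.5, 1.2, 0.7, -0.1])

# Fitting a polynomial whose degree is one less than the number of points
# gives an exact interpolant, not a least-squares fit.
coeffs = np.polyfit(xs, ys, deg=len(xs) - 1)

# Check that the polynomial really passes through every point.
assert np.allclose(np.polyval(coeffs, xs), ys)
print(coeffs)  # coefficients, highest power of x first
```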
So let's take points on a function at equally spaced values of \(x\), and try to use them to approximate the function:
$$f(x)=\frac1{1+25x^2}$$
Polynomial interpolations of \(\displaystyle f(x)=\frac1{1+25x^2}\) using equally spaced points
I'm sure you'll agree that these approximations are pretty terrible, and they get worse as more points are added. The high error towards 1 and -1 is called Runge's phenomenon, and was discovered in 1901 by Carl David Tolmé Runge.
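If you want to see just how bad it gets, here's a rough Python sketch (my own illustration, using NumPy) that interpolates \(f\) at equally spaced points and prints the worst error on \([-1,1]\) as more points are added:

```python
import numpy as np

f = lambda x: 1 / (1 + 25 * x**2)

# A fine grid on [-1, 1], used only to measure how bad each approximation is.
grid = np.linspace(-1, 1, 2001)

for n_points in [5, 9, 13, 17]:
    # Interpolate f at equally spaced points with a degree (n_points - 1) polynomial.
    # (np.polyfit may warn about conditioning at the higher degrees; that's expected.)
    xs = np.linspace(-1, 1, n_points)
    coeffs = np.polyfit(xs, f(xs), deg=n_points - 1)
    worst_error = np.max(np.abs(np.polyval(coeffs, grid) - f(grid)))
    print(n_points, worst_error)
```

The worst error grows rather than shrinks as more points are added, and it is always largest near \(x=\pm1\).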
All hope of finding a good polynomial approximation is not lost, however: by choosing the points more carefully, it's possible to avoid Runge's phenomenon. Chebyshev points (named after Pafnuty Chebyshev) are defined by taking the \(x\) co-ordinate of equally spaced points on a circle.
Eight Chebyshev points
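In code, one common convention for \(n\) Chebyshev points takes the cosines of \(n\) equally spaced angles (a sketch, again using NumPy):

```python
import numpy as np

def chebyshev_points(n):
    """n Chebyshev points in [-1, 1]: the x co-ordinates of points spaced
    at equal angles around the upper half of the unit circle."""
    k = np.arange(n)
    angles = (2 * k + 1) * np.pi / (2 * n)
    return np.cos(angles)

print(chebyshev_points(8))  # eight points, clustered towards -1 and 1
```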
The following GIF shows interpolating polynomials of the same function as before using Chebyshev points.
Nice, we've found a polynomial that closely approximates the function... But I guess you're now wondering how well the Chebyshev interpolation will approximate other functions. To find out, let's try it out on the votes over time of my first round Big Internet Math-Off match.
Scroggs vs Parker, 6-8 July 2018
The graphs below show the results of the match over time interpolated using 16 uniform points (left) and 16 Chebyshev points (right). You can see that the uniform interpolation is all over the place, but the Chebyshev interpolation is very close to the actual results.
Scroggs vs Parker, 6-8 July 2018, approximated using uniform points (left) and Chebyshev points (right)
But maybe you still want to see how good Chebyshev interpolation is for a function of your choice... To help you find out, I wrote @RungeBot, a Twitter bot that can compare interpolations with equispaced and Chebyshev points. Since first publishing this post, Twitter's API changes broke @RungeBot, but it now lives on Mathstodon: @RungeBot@mathstodon.xyz. Just send it a function, and it'll show you how bad Runge's phenomenon is for that function, and how much better Chebyshev points are.
For example, if you were to toot "@RungeBot@mathstodon.xyz f(x)=abs(x)", then RungeBot would reply: "Here's your function interpolated using 17 equally spaced points (blue) and 17 Chebyshev points (red). For your function, Runge's phenomenon is terrible."
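If you'd rather not wait for a reply, here's a rough sketch (not RungeBot's actual code) of the comparison it makes, for \(f(x)=|x|\) with 17 points of each kind:

```python
import numpy as np

f = lambda x: np.abs(x)
n = 17
grid = np.linspace(-1, 1, 2001)  # fine grid for measuring the error

# 17 equally spaced points and 17 Chebyshev points on [-1, 1].
equispaced = np.linspace(-1, 1, n)
chebyshev = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))

for name, xs in [("equally spaced", equispaced), ("Chebyshev", chebyshev)]:
    # Degree-16 interpolating polynomial through the 17 points.
    coeffs = np.polyfit(xs, f(xs), deg=n - 1)
    worst_error = np.max(np.abs(np.polyval(coeffs, grid) - f(grid)))
    print(f"{name}: worst error {worst_error:.3f}")
```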
A list of constants and functions that RungeBot understands can be found here.
                        

Comments

Hi Matthew, I really like your post. Is there a benefit of using Chebyshev-spaced polynomial interpolation rather than OLS polynomial regression when it comes to real-world data? It is clear to me that if you have a symmetric function your approach is superior in capturing the centre data point. But in my understanding, in your vote example a regression minimizing the residuals would be preferable for minimizing the error. Or am I missing something?
Benedikt
© Matthew Scroggs 2012–2024