This notebook implements a two-dimensional solution of the Kuramoto-Sivashinsky equation (KSE) on the GPU,

$$u_t + \tfrac{1}{2}\lvert\nabla u\rvert^2 + \nabla^2 u + \nabla^4 u = 0.$$
The KSE arises in a number of contexts and was derived by Yoshiki Kuramoto and Gregory Sivashinsky while researching laminar flame front instabilities. It’s one of the simplest partial differential equations exhibiting complicated dynamics, displaying chaotic behavior in large domains. It’s particularly interesting because it’s a chaotic PDE of a single dependent variable; contrast this with the Lorenz attractor, which is a chaotic ODE of three variables.
For the fully immersive version, see the WebGL visualization here. This page explores modernization to WebGPU and Observable Notebook Kit, as well as adding cleanup like plot axes (at last!).
Solution method
This notebook follows the solution method outlined in A. Kalogirou’s thesis, Nonlinear dynamics of surfactant-laden multilayer shear flows and related systems, particularly equations (F.8) - (F.10). The solution uses an implicit-explicit Backward Differentiation Formula in the spatial frequency domain.
The equation is solved in the spatial frequency domain, with the exception of the nonlinear term: the gradient is computed in the frequency domain, transformed back to the spatial domain, squared, and then transformed back to the frequency domain.
A few implementation notes:
- Equation (F.10) seems to be missing a factor in the biharmonic term.
- Since all terms include derivatives, a constant offset of the solution has no effect and can simply be removed by zeroing out the mean (zero-wavenumber) component on every update.
- The multi-step method is initialized with the same values for both previous steps, rather than implementing a special Backward Euler initialization step.
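The last two notes can be sketched on the CPU. The buffer layout and function names here are hypothetical, not the notebook’s actual code; spectra are assumed to be stored as interleaved `[re0, im0, re1, im1, …]`:

```javascript
// Sketch of two implementation notes above: zeroing the mean
// (zero-wavenumber) bin, and seeding both multi-step history
// buffers with the same initial data. Names are illustrative.

// Spectrum stored as interleaved [re0, im0, re1, im1, ...].
function zeroMean(spectrum) {
  // The k = 0 bin holds the mean of the field; every term of the
  // KSE involves a derivative, so this component is inert.
  spectrum[0] = 0; // real part of the k = 0 bin
  spectrum[1] = 0; // imaginary part of the k = 0 bin
  return spectrum;
}

function initHistory(initialSpectrum) {
  // BDF2 needs data at steps n and n-1; instead of a Backward
  // Euler bootstrap, both slots start from the same values.
  return {
    prev: Float32Array.from(initialSpectrum),
    prevPrev: Float32Array.from(initialSpectrum),
  };
}
```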
The domain has size $L_1 \times L_2$, but the equation is solved in the domain $[0, 2\pi]^2$ via the rescaling in Chapter 9 of Kalogirou’s thesis,

$$x \mapsto \frac{2\pi}{L_1}\,x, \qquad y \mapsto \frac{2\pi}{L_2}\,y,$$

along with the factors $\nu_1 = (2\pi/L_1)^2$ and $\nu_2 = (2\pi/L_2)^2$.
From Appendix F on page 227, the full second order spatial frequency domain update equation for the solution $\hat{u}^{n+1}$ at step $n+1$ as a function of the data from the previous steps $n$ and $n-1$ is

$$\hat{u}^{n+1} = \frac{4\hat{u}^{n} - \hat{u}^{n-1} - 2\Delta t\left(2\hat{N}^{n} - \hat{N}^{n-1}\right)}{3 + 2\Delta t\left(\omega^2 - \omega\right)},$$

where

$$\hat{N}^{n} = \frac{1}{2}\left(\nu_1\,\widehat{\left(u_x^n\right)^2} + \nu_2\,\widehat{\left(u_y^n\right)^2}\right),$$

where $\hat{\cdot} = \mathcal{F}[\,\cdot\,]$ represents the spatial Fourier Transform and $u$ is the spatial domain solution.

Finally,

$$\omega = \nu_1 k_x^2 + \nu_2 k_y^2,$$

using the definition $\nu_{1,2} = \left(2\pi/L_{1,2}\right)^2$.
At first this update equation seems imposing, but if $k_x$ and $k_y$ refer to a particular wavenumber, then the above is a simple algebraic expression for each grid point, independent of all the others. The only exceptions are the expressions for $\widehat{(u_x)^2}$ and $\widehat{(u_y)^2}$, which represent the solution, differentiated in the frequency domain via multiplication by $\mathrm{i}k_x$ and $\mathrm{i}k_y$, inverse-FFT’d into the spatial domain, squared, and then FFT’d back into the spatial frequency domain. From there, the rest is tedious but straightforward shuffling of buffers.
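The per-wavenumber algebra can be sketched on the CPU. This assumes the generic second-order IMEX-BDF form with a real symbol $\omega = \nu_1 k_x^2 + \nu_2 k_y^2$; the function and field names are illustrative, not the notebook’s actual buffers:

```javascript
// Sketch of one per-wavenumber BDF2/IMEX update (illustrative names).
// uHatN, uHatNm1: complex Fourier coefficients {re, im} of the
// solution at steps n and n-1; nHatN, nHatNm1: the transformed
// nonlinear term at the same steps; omega: the real linear symbol.
function updateBin(uHatN, uHatNm1, nHatN, nHatNm1, omega, dt) {
  // The denominator is real, so the complex division reduces to a
  // real scale factor applied to both components.
  const denom = 3 + 2 * dt * (omega * omega - omega);
  return {
    re: (4 * uHatN.re - uHatNm1.re - 2 * dt * (2 * nHatN.re - nHatNm1.re)) / denom,
    im: (4 * uHatN.im - uHatNm1.im - 2 * dt * (2 * nHatN.im - nHatNm1.im)) / denom,
  };
}
```

Every bin runs this independently, which is what makes the method map so cleanly onto a compute shader.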
WebGPU implementation
WebGPU has no built-in FFT, so this notebook implements Cooley-Tukey radix-2 in compute shaders: bit-reversal at the input, then butterfly stages. For grids larger than the maximum workgroup size (256 on most devices), a hierarchical four-step FFT breaks the transform into smaller pieces. Getting the indexing, normalization, and twiddle factors right is tedious, but at least FFT bugs are obvious.
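A CPU-side reference implementation is handy for validating the shader output. This is a sketch of the same radix-2 structure (bit-reversal, then butterfly stages), not the WGSL itself:

```javascript
// Reference radix-2 Cooley-Tukey FFT, in place on parallel re/im
// arrays whose length must be a power of two.
function fft(re, im, inverse = false) {
  const n = re.length;
  // Bit-reversal permutation, mirroring the shader's input pass.
  for (let i = 1, j = 0; i < n; i++) {
    let bit = n >> 1;
    for (; j & bit; bit >>= 1) j ^= bit;
    j ^= bit;
    if (i < j) {
      [re[i], re[j]] = [re[j], re[i]];
      [im[i], im[j]] = [im[j], im[i]];
    }
  }
  // Butterfly stages with on-the-fly twiddle factors.
  for (let len = 2; len <= n; len <<= 1) {
    const ang = (inverse ? 2 : -2) * Math.PI / len;
    for (let i = 0; i < n; i += len) {
      for (let k = 0; k < len / 2; k++) {
        const wr = Math.cos(ang * k), wi = Math.sin(ang * k);
        const a = i + k, b = i + k + len / 2;
        const tr = re[b] * wr - im[b] * wi;
        const ti = re[b] * wi + im[b] * wr;
        re[b] = re[a] - tr; im[b] = im[a] - ti;
        re[a] += tr; im[a] += ti;
      }
    }
  }
  // Normalize the inverse transform by 1/n.
  if (inverse) {
    for (let i = 0; i < n; i++) { re[i] /= n; im[i] /= n; }
  }
}
```

A round trip (forward then inverse) should reproduce the input to floating-point precision, which makes FFT bugs easy to smoke-test.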
Wavenumber layout in DFTs is also error-prone. The first half of the output holds the non-negative frequencies $k = 0, \dots, n/2 - 1$; the second half holds the negative frequencies $k = -n/2, \dots, -1$. A wrong sign or an off-by-one index makes derivatives blow up fast.
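A small helper makes the layout explicit (a sketch; `waveNumber` is an illustrative name, not a notebook function):

```javascript
// Signed wavenumber for output bin i of an n-point DFT:
// bins [0, n/2) hold k = 0 .. n/2-1,
// bins [n/2, n) hold k = -n/2 .. -1.
function waveNumber(i, n) {
  return i < n / 2 ? i : i - n;
}
```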
The solution is real-valued, but transforming it with a complex FFT invites numerical drift: the imaginary part should stay zero, but floating point errors leak energy into it over time, eventually destabilizing the solution. The fix is an extra pass per time step to extract the real part before transforming back. A real-valued transform, such as the Hartley transform, would avoid this entirely.
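The extra pass can be sketched as follows. In the notebook this is a compute pass over GPU buffers, not a JavaScript loop, and the names here are illustrative:

```javascript
// Sketch of the stabilizing pass: after an inverse FFT brings a
// differentiated field back to the spatial domain, drop whatever
// has leaked into the imaginary channel before squaring and
// transforming forward again.
function squareRealPart(re, im) {
  for (let i = 0; i < re.length; i++) {
    re[i] = re[i] * re[i]; // square the (physically real) field
    im[i] = 0;             // discard accumulated imaginary drift
  }
}
```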
Finally, you may or may not like this, I might lose followers, etc., but I converted this notebook from WebGL 1 partly by way of Claude Code, using my experimental MCP server for Observable Notebook Kit. It implements a small server which controls an in-browser dev preview over a WebSocket, querying runtime outputs, modifying control inputs, analyzing the variable dependency graph, and even analyzing image output. I’m not religious about AI in either direction. It’s good at what it’s good at. It’s evil where it’s used badly, and that’s that.