Previously, we explored a method for computing the essence of power functions $Ȣx^n$ that involves solving a large system of linear equations. This method is equivalent to inverting a large $n\times n$ matrix whose entries are values from Pascal's triangle. Though the matrix method lets us solve for a large number of essences at once, it does not extend easily to the next iteration of essence coefficients: rather than reusing the values we have already computed, we would have to invert a separate, larger matrix from scratch. Here we will introduce an iterative method for solving for these coefficients.

Chapter 0: Recap

Let us first remind ourselves of the definition of essence. For a function $f(x)$, we want to find the transformation $Ȣf(x)$ that lets us 'smooth out' its series:

$$\sum_{i=a}^b f(i) = \int_{a-1}^b Ȣf(x)\, dx$$

For example, we can solve for the following functions:

$$\begin{align*}Ȣ1 &= 1 \\ Ȣx &= x +...
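As a quick sanity check of the defining identity, here is a minimal numerical sketch. It assumes the value $Ȣx = x + \tfrac{1}{2}$ (the table above is truncated, so this value is derived from the definition rather than quoted) and verifies that the sum of $f(i) = i$ matches the integral of its essence:

```python
# Numerical check of the essence identity
#   sum_{i=a}^{b} f(i) = ∫_{a-1}^{b} Ȣf(x) dx
# for f(x) = x, whose essence works out to Ȣx = x + 1/2
# (derived from the definition; the article's table is truncated here).

def integral_essence_x(lo, hi):
    # Exact antiderivative of x + 1/2 is x^2/2 + x/2.
    F = lambda x: x * x / 2 + x / 2
    return F(hi) - F(lo)

a, b = 3, 10
lhs = sum(range(a, b + 1))          # sum_{i=a}^{b} i
rhs = integral_essence_x(a - 1, b)  # ∫_{a-1}^{b} (x + 1/2) dx
print(lhs, rhs)                     # both equal 52
assert abs(lhs - rhs) < 1e-9
```

Any choice of integer bounds $a \le b$ works here, since the identity is exact rather than an approximation.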
Summations are fun, but they can be really tricky. Most often, the fun properties come from the complexity of the terms defined inside the Sigma. What I became curious about is the properties of Sigma summation itself: what interesting properties does summation have when summations are multiplied by each other, or even nested inside one another? That is the question we are going to explore in this article: tricks and properties of summation multiplication and nested summations!

Before exploring our own properties, let us cover some of the better-known ones:

$$\sum_{i=n}^m f(i) + \sum_{i=n}^m g(i) = \sum_{i=n}^m(f(i)+g(i))\\ \sum_{i=n}^m cf(i) = c\sum_{i=n}^m f(i)\\ \sum_{i=n}^m 1 = m+1-n\\ \sum_{i=n}^m f(i) = \sum_{i=k}^m f(i) + \sum_{i=n}^{k-1} f(i)$$

With these simpler properties out of the way, let's begin with the more exciting concepts:

Multiplication

$$\sum_{i=n}^m f(i) \times \sum_{j=a}^b g(j)$$

Let's start off simple: what comes after addition? Multiplication. The best way...
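Before diving into the algebra, a quick numerical sketch of where this is headed: a product of two independent sums equals the double sum of pairwise products, $\left(\sum_{i=n}^m f(i)\right)\left(\sum_{j=a}^b g(j)\right) = \sum_{i=n}^m \sum_{j=a}^b f(i)\,g(j)$. The functions and bounds below are arbitrary choices for illustration:

```python
# Check that a product of two sums equals the double sum of products:
#   (Σ_{i=n}^{m} f(i)) · (Σ_{j=a}^{b} g(j)) = Σ_i Σ_j f(i) g(j)
# f, g, and the bounds are arbitrary illustrative choices.

f = lambda i: i * i      # f(i) = i^2
g = lambda j: 2 * j + 1  # g(j) = 2j + 1

n, m, a, b = 1, 4, 0, 3

product = sum(f(i) for i in range(n, m + 1)) * sum(g(j) for j in range(a, b + 1))
double_sum = sum(f(i) * g(j)
                 for i in range(n, m + 1)
                 for j in range(a, b + 1))
print(product, double_sum)  # both equal 480
assert product == double_sum
```

The equality holds because each index runs independently of the other, which is exactly the situation the distributive law describes when the product is expanded term by term.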