Hello, I am referring to the previous posts about the puzzle I proposed, along with a possible algorithmic solution.
I also drafted a proposal on how to split the problem across multiple concurrent processes.
At this point maybe I should write a few words on why we would want to do this and when it could become useful.
There are different paradigms that allow us to split a problem into multiple sub-problems in order to leverage multiple available processors.
The Seti@home project efficiently does exactly this.
In my view we can classify these kinds of problems into two main categories:
1) Problems that can be "hashed" into subsets (or "buckets") of data
2) Problems that cannot be immediately hashed, since the state f(n+1) depends on f(n); therefore I cannot ask a process to compute f(n+1) unless it has already computed f(n)
Let's take some simple examples:
The equation that allows us to represent a line can be expressed as y = a + bx
This means that we can calculate y(x) directly for any value of x (x being a real number), so such a problem falls easily into my category 1)
However, when we split the workload across different processors, we normally consider finite increments (this is common in real-life problems), so that the variable x is a natural number (an integer)
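To make category 1 concrete, here is a minimal sketch in Python (not code from the puzzle, and with made-up coefficients) of how the independent evaluations of y = a + bx over integer values of x could be spread across worker processes:

from multiprocessing import Pool

A, B = 2.0, 0.5  # made-up coefficients for y = a + b*x

def y(x):
    # Each evaluation depends only on x, never on a previous result,
    # so any subset ("bucket") of x values can go to any processor.
    return A + B * x

if __name__ == "__main__":
    xs = range(1_000_000)              # finite increments: x is a natural number
    with Pool() as pool:               # one worker per available processor
        results = pool.map(y, xs, chunksize=10_000)
    print(results[:5])                 # [2.0, 2.5, 3.0, 3.5, 4.0]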
Some other problems, which are common when we run simulations, require an iterative approach.
By that I mean that in order to calculate a function (or better, a series) y(x) we need to apply a set of rules that take into account previous values of the series itself.
A typical, well-known example is the Fibonacci number series.
These fall in my category 2) and may be important in computing simulated scenarios.
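Here is a small illustration (a plain iterative version, not tied to the puzzle) of the category-2 dependency: each Fibonacci term needs the previous ones, so no process can jump ahead to f(n+1) before f(n) is known.

def fibonacci(n):
    # Build the first n Fibonacci numbers strictly in order:
    # f(i+1) is defined in terms of f(i) and f(i-1).
    series = []
    a, b = 0, 1
    for _ in range(n):
        series.append(a)
        a, b = b, a + b
    return series

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]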
Still, there is a distinction to be made within category 2:
Series like the Fibonacci one have a single possible value at each step i, such that f(i) is defined and has a single solution.
When exploring simulated scenarios, f(i) normally presents several possible solutions, as can be seen in the description of the proposed puzzle.
While this indeed makes the problem harder to tackle, it also (eventually) opens it up to parallel processing.
In fact, since f(i) has multiple values, f(i+1) is also going to have multiple values (in general more than f(i), unless f is converging), meaning that each value of f(i) can be considered the start of a new problem and then individually assigned to a separate processor.
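The sketch below shows this idea in Python, assuming a hypothetical expand() rule (it stands in for whatever the real model would do) that returns all possible next states of a given state; since every state at step i branches independently of the others, the whole frontier can be expanded in parallel.

from multiprocessing import Pool

def expand(state):
    # Hypothetical branching rule: each state produces several successors,
    # standing in for the multiple possible values of f(i+1).
    return [state + d for d in (1, 2, 3)]

def explore(initial_state, steps):
    frontier = [initial_state]                 # the starting value of f(0)
    with Pool() as pool:
        for _ in range(steps):
            # Each state is an independent sub-problem handed to a worker.
            branches = pool.map(expand, frontier)
            frontier = [s for branch in branches for s in branch]
    return frontier

if __name__ == "__main__":
    print(len(explore(0, 5)))  # 3**5 = 243 candidate states after five steps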
What could be a practical example of a problem benefiting from this approach?
I am thinking about models used to simulate the spread of a pandemic, or the evolution of a meteorological event: any model that aims to test all the possible evolutions of something which is not directly predictable using a mathematical function.
I don't think I invented this approach; I am sure there is a lot of great work done already. I am just curious to try it myself.