Logical vs Physical Design

Back in the bad old days life was pretty simple. Applications lived in a single computer process. Then came the idea that multiple processes could be used to attack certain problems. This rapidly evolved to encompass the use of multiple computers. Moore's law (and networking) made it seemingly inevitable.

While hardware advances made horizontal and vertical scaling possible, software lagged. More recent high-level languages have included support for distributed computing (e.g., the Java RMI Remote interface), but these required that the developer decide, during application development, how the application was to be distributed. Changing the physical distribution of the application was a huge pain, since it touched a lot of code. Even middleware technologies suffered from this shortcoming.

The problem is the combination of writing a logical application (defining what is to be done), and physically partitioning it (defining where particular tasks are done). Ideally, these ought to be two separate steps, such that physical distribution of the application doesn't affect the logic of the application at all. In a really cool universe, the distribution can be changed on-the-fly, at run-time.

The closest I've seen any language come to this ideal was the Forte O-O 4GL, TOOL. You developed the application using a repository full of the code artifacts you needed for your application. After designing and debugging the application logic, usually in a single "node" environment (a single address space), you went through a separate "partitioning workshop", where you decided where the pieces of the application would run in your particular environment. This didn't affect the application code at all; application logic and physical distribution were completely orthogonal.

Those Forte Software guys were very clever. I joined Forte in 1997, when they were already working on R3 of this stuff. To me it was like magic. I'd spent years creating distributed C/C++ apps. If I'd had Forte, back then, I'd have been able to create those distributed applications in a fraction of the time, and I'd probably have less grey hair today!

But time marches on. The creation of distributed applications is changing yet again. The olde way of creating large applications is from reusable code entities, frameworks, and even generated code (MDA, anyone?), and then partitioning the result for execution in a distributed environment. The new way is termed "composite application development", and is founded on service-oriented architecture. Applications are now (largely) aggregations of (reusable) services, plus some connective glue. Services provide coarse-grained functions. As with the Forte 4GL, distribution of functions (services) within a SOA is orthogonal to the actual application logic. Calling a locally provided service is semantically identical to calling a remotely provided one. (This is one of the reasons JBI has only a single service invocation API, which hides locality from both the consumer and the provider.)
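To make the locality-transparency idea concrete, here's a minimal sketch in Java. All of the names here (`Service`, `ServiceRegistry`, `RemoteGreeterProxy`) are hypothetical, invented for illustration; this is not the actual JBI API, just the shape of the idea: the consumer invokes a service by name through one interface, and whether the implementation behind that name is local or remote is invisible to the calling code.

```java
import java.util.HashMap;
import java.util.Map;

// A coarse-grained service contract (hypothetical; not the JBI API).
interface Service {
    String invoke(String request);
}

// Resolves a service name to whatever implementation is registered.
// The consumer cannot tell (and need not care) whether the registered
// implementation is a local object or a proxy to a remote endpoint.
class ServiceRegistry {
    private final Map<String, Service> endpoints = new HashMap<>();

    void register(String name, Service impl) {
        endpoints.put(name, impl);
    }

    String call(String name, String request) {
        return endpoints.get(name).invoke(request);
    }
}

public class LocalityDemo {
    public static void main(String[] args) {
        ServiceRegistry registry = new ServiceRegistry();

        // A local, in-process implementation...
        registry.register("greeter", req -> "Hello, " + req);

        // ...which could later be swapped for a remote proxy, with no
        // change at all to the consumer's calling code, e.g.:
        // registry.register("greeter", new RemoteGreeterProxy(url));

        System.out.println(registry.call("greeter", "world"));
    }
}
```

The point of the single invocation API is exactly this swap: repartitioning the application means re-registering endpoints, not rewriting consumers.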

Developing large applications is tough enough. Being able to disregard distribution issues during development and maintenance is truly a blessing to the developer, who has enough things on his plate. Forte Software managed to deliver such benefits to developers using the vehicle of a proprietary 4GL. Today we can realize the same advantages in application development using standards-based SOA.

About

rtenhove
