November 11, 2011
NooLabGlue is a framework for linking applications or parts of applications. There are, of course, already a lot of them: currently, O'Reilly lists 185 different ones in their P2P directory (although they also include very low-level items such as XML-RPC).
NooLabGlue is different. Its paradigm is aligned with natural neural systems. Thus, its primary emphasis is not on throughput; instead, NooLabGlue tries to support the associative, probabilistic, adaptive linking of a growing population of Self-Organizing Maps, where the individual SOMs tend to diverge regarding their "content."
Naturally, NooLabGlue is a framework for massively modular neural systems, where parallelism may occur at any level. Such artificial neural systems may be realized as Self-Organizing Maps (SOM, Kohonen map) or as more traditional ANNs. Yet NooLabGlue also allows linking traditional components or applications in an EAI context, or use as a simple service bus. NooLabGlue transcends the client/server paradigm and replaces it with a source/receptor paradigm against a communicological background.
Thus, not only people behind their computing machines can be connected, but also more or less autonomous software entities. Yet NooLabGlue is not a middleware mainly designed to distribute a single, particular, well-identified learning task, or even to serve as an infrastructure for distributing the training of an "individual" SOM. These kinds of goals are much too narrow-minded: they just virtualize a Turing-computation task to run on many machines instead of one. In contrast, we are looking for a middleware that supports a growing population of probabilistically connected SOMs in the context of non-Turing computation, which requires strikingly different architectural means for the middleware.
The level of integration follows a strikingly different paradigm from that of other packages or approaches, from SOAP, web services, and RMI up to systems like Akka. The contracting is not accomplished on the level of individual fields (which is the reason for the complexity of SOAP and WS), but on the level of document types, behaviors, and names. Related to this is the plan to incorporate the ability to learn into the MessageBoard.
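To make the contrast concrete, contracting by document type can be pictured as a routing table keyed by type names rather than by field-level schemas. The following is a minimal sketch of that idea only; the class and method names (`Board`, `register`, `deliver`, the type string `"som.update"`) are illustrative assumptions, not NooLabGlue's actual API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

public class DocumentTypeDispatchSketch {

    // Hypothetical board: contracts are registered per document type,
    // not per field, so the "schema" is just a type name plus a behavior.
    static class Board {
        private final Map<String, Consumer<String>> handlers = new HashMap<>();

        // A participant declares interest in a document type.
        void register(String documentType, Consumer<String> handler) {
            handlers.put(documentType, handler);
        }

        // Delivery succeeds only if a contract for this type exists.
        boolean deliver(String documentType, String xmlPayload) {
            Consumer<String> h = handlers.get(documentType);
            if (h == null) return false; // no contract for this type
            h.accept(xmlPayload);
            return true;
        }
    }

    public static void main(String[] args) {
        Board board = new Board();
        board.register("som.update", xml -> System.out.println("received: " + xml));
        board.deliver("som.update", "<somUpdate/>");
    }
}
```

The point of the sketch is that adding a new field to a payload does not break the contract, since the contract names only the document type.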
NooLabGlue follows an explicit transactional approach, which (with the exception of the transaction id) is completely hidden at the level of the frontend API. Available transports are UDP, TCP, FTP, and RESTful HTTP (implemented using the Restlet framework), while everything is wrapped into XML. The complexity of these protocols is completely hidden at the level of the API. Supporting different transport protocols allows for speedy connections in a LAN, while the whole framework can also run over the WWW/WAN; in fact, MessageBoards (the message servers) allow for cascading a message (even transactional messages) from local to remote segments of a network.
The API for participants ("clients" of the MessageBoard) is very simple; it just provides a factory method for creating an instance, as well as "dis/connect()," "send()," and a callback for receiving messages. Sending a (transactional) message is realized as a service, i.e. as a persistent contract with the "future" (or, in short, as a future). Participants and even the MessageBoard can be shut down and restarted without losing a (fully transferred) message. Lost connections are re-established without cookies, on the basis of (optionally) unique names.
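A participant's side of such an API could look roughly like the following. This is a hypothetical sketch of the shape described above, not NooLabGlue's actual classes; all names (`Participant`, `MessageReceiver`, `create`, and the addresses) are assumptions for illustration.

```java
public class GlueParticipantSketch {

    // Callback for incoming messages, as described in the text.
    interface MessageReceiver {
        void onMessage(String transactionId, String xmlPayload);
    }

    // Minimal participant facade: factory, dis/connect, send, callback.
    static class Participant {
        private final String name;       // (optionally) unique name for reconnects
        private MessageReceiver receiver;
        private boolean connected;

        private Participant(String name) { this.name = name; }

        // Factory method for creating an instance.
        static Participant create(String uniqueName) {
            return new Participant(uniqueName);
        }

        void connect(String boardAddress) { connected = true; }
        void disconnect() { connected = false; }
        void setReceiver(MessageReceiver r) { this.receiver = r; }

        // Sending returns only a transaction id; the transactional
        // machinery itself stays hidden behind the API.
        String send(String recipient, String xmlPayload) {
            if (!connected) throw new IllegalStateException("not connected");
            return java.util.UUID.randomUUID().toString();
        }
    }

    public static void main(String[] args) {
        Participant p = Participant.create("som-node-1");
        p.setReceiver((txId, xml) -> System.out.println("got " + txId));
        p.connect("tcp://localhost:9000");
        String txId = p.send("som-node-2", "<doc type='som.update'/>");
        System.out.println("sent with transaction id: " + txId);
        p.disconnect();
    }
}
```

Since `send()` yields only a transaction id, a participant can be restarted and re-identified by its unique name without the caller ever touching the underlying transport.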
Both the participants and the MessageBoard run in a multi-threaded manner, of course, for all kinds of transport layers. The language of realization is Java, though any language could be used to connect to the MessageBoard. In principle, the MessageBoard could also be realized in PHP, for instance, since the server does not store or access any binary objects.
NooLabGlue has been created in the context of advanced machine learning that proceeds far beyond the issue of the algorithm. The goal (within the next six weeks) is a "natural account" of understanding (e.g. of language) based on the paradigm of machine-based epistemology.