
Current Technology Capacity

The architecture determines whether the required compute capacity is measured in thousands of transactions per day... or hundreds... or tends toward zero.



Warren Jones and Lana Rubalsky (2010), "Current Technology Capacity", wJones Research, August 10, 2010
The main objective of the Stored Purpose project was to develop a computer generally able to understand and serve. This led to the design of facilities for general understanding and general technology manipulation. The understanding of the capacities and limits of available technology to implement intelligence comes from this design.

While some, such as Ray Kurzweil, have discussed approximate limits of current technology and a date when such limits might be overcome, those projections lacked an architectural model asserting a specific logic and its derivative information-processing requirements. As such, these worthy efforts assumed a worst-case scenario in their calculations.

The Stored Purpose architectures assert two primary logic sets: an existence model for physically simple intelligent systems, and a cellular construction model for systems of greater complexity. The architectures take as their basis a symbolic information store of essence data in Platonic Form, linked in context space and propagated to thought processes.

To illustrate the importance of architecture in determining required compute capacity, consider the computational requirements of coordination between a lawyer and a client who is a human resources manager.
  • Stored Program Raw Results Based Data Processing - Using a classic stored program approach with office productivity applications, both the lawyer and the client edit employment contracts and correspondence in local programs during the workday and exchange program data in the form of files. As edits to documents hundreds or thousands of kilobytes in size are saved and exchanged every few minutes, several distributed PC, server, and communications routing computers are employed, each processing the raw body of work in process. With automatic file backups from PC to server, related replication of correspondence to mobile devices, and replication of server data to offsite facilities, several hundred megabytes of data will transit each machine and the communications routers between them. The aggregate capacity required to support these two workers could be in the range of tens of billions of data packet transactions per day, and would be primarily file-transmission related.
  • Cloud Based Effort Data Processing - Using an alternative cloud computing architecture with remote office productivity applications, the lawyer and the HR manager send only keyboard and mouse stroke records of their raw effort to remote computers and exchange results by way of collaborative web pages. Because the total keyboard and mouse stroke data transmitted for a typical person’s workday is less than one megabyte, several gigabytes of file data exchange are eliminated. Although exchange and review of reference contracts and background material is still computationally intensive, the aggregate capacity required would be reduced to hundreds of millions of data transactions per day, and is primarily communications related.
  • Stored Purpose Contextual Intent Processing - Using a stored purpose architecture with assistant-class agents supporting both the HR manager and the lawyer, raw data changes, as well as the requisite data transmission … tend toward zero. Although there is some communication related to agents monitoring the contextual status of the lawyer and the manager, computation is primarily related to agent Understanding and assistive Goal Pursuit. Because both of these processes are triggered primarily by prediction failure, they too tend toward zero as the percentage of variable objects managed by the lawyer and the HR manager (company resources and responsibilities) in semantic communication (aided by assistant agents) tends toward 100%. Thus in a Stored Purpose system, although the computational complexity of Understanding and Goal Pursuit processing is many times higher than that of communication-oriented Stored Program processing, the net computational load in many business and government enterprises would be lower than current demand.
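The relative magnitudes in the three scenarios above can be sketched as a back-of-envelope calculation. Every constant below (document size, save frequency, hop count, review traffic, monitoring overhead) is an illustrative assumption chosen to echo the figures in the text, not a measurement:

```python
# Rough daily data-transit comparison for the lawyer / HR-manager
# scenario. All constants are hypothetical illustration values.

KB = 1_000
MB = 1_000_000

def stored_program_bytes(doc_size=500 * KB, saves_per_day=200, hops=6):
    """File-based exchange: each save moves the whole document across
    PCs, servers, routers, backups and offsite replicas (the hops)."""
    return doc_size * saves_per_day * hops

def cloud_effort_bytes(keystrokes=1 * MB, reference_review=50 * MB):
    """Effort-based exchange: only keyboard/mouse-stroke records travel,
    plus some reference-material review traffic."""
    return keystrokes + reference_review

def stored_purpose_bytes(monitoring=0.5 * MB):
    """Intent-based exchange: only agent status monitoring remains;
    raw data transmission tends toward zero."""
    return monitoring

for name, fn in [("stored program", stored_program_bytes),
                 ("cloud effort", cloud_effort_bytes),
                 ("stored purpose", stored_purpose_bytes)]:
    print(f"{name:>14}: {fn() / MB:8.1f} MB/day")
```

Under these assumptions the file-based approach moves several hundred megabytes per day, the effort-based approach tens of megabytes, and the intent-based approach well under one.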
Note that this is a simple illustration. There are weaknesses in current host processors that may constrain environments with large amounts of freewill valuation processing. These constraints will diminish once large-cache CPU (e.g., x86) and GPU (e.g., CUDA) technologies can be fabricated on a single die (without a bus bottleneck). The bottom line is that as the efficiency of the architecture increases, the overall capacity requirements decrease, and Stored Purpose technologies will be more efficient than current stored program systems.
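The prediction-failure scaling claimed for Stored Purpose systems can also be sketched numerically. The event count and per-failure cost below are hypothetical units, chosen only to show the linear fall toward zero as the semantically managed fraction approaches 100%:

```python
# Agent Understanding and Goal Pursuit run only when a prediction
# fails, so daily compute load falls linearly with the fraction of
# variable objects handled in semantic communication. Both constants
# are hypothetical illustration values.

EVENTS_PER_DAY = 10_000        # variable-object changes per workday (assumed)
COST_PER_FAILURE = 1_000_000   # Understanding / Goal Pursuit work units (assumed)

def stored_purpose_load(managed_fraction: float) -> float:
    """Compute triggered only by prediction failures."""
    failures = EVENTS_PER_DAY * (1.0 - managed_fraction)
    return failures * COST_PER_FAILURE

for f in (0.50, 0.90, 0.99, 1.00):
    print(f"managed {f:6.0%} -> load {stored_purpose_load(f):.2e} units/day")
```

Even though each failure is expensive relative to a file transfer, the load vanishes entirely at full semantic coverage, which is the point of the comparison above.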
