Thought and Memory

Short-term and Long-term Memory

Attribute      Short-term Memory        Long-term Memory
Capacity       27                       230
Decay type     Linear with time         Logarithmic with time
Decay rate     Rapid                    Slow

Forgetting

A memory that is nearing its capacity may forget items. In general, the least important items in the memory are forgotten to make room for new ones. Importance is initially assigned by some algorithm related to the likely relevance of the information to the entity possessing the memory. Thereafter, importance decays with time according to some algorithm (see decay type and decay rate in the chart above). Importance can also be enhanced by specific memory events (see Remembering, below).
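
The following C++ fragment is a purely illustrative sketch of such a forgetting pass. The stand-in Item type, the decay constants, and the policy of evicting only items below the forgetfulness threshold are assumptions, not settled design:

    #include <cmath>
    #include <map>

    typedef unsigned long ulong;

    struct Item { double importance; };   // stand-in for a KnowledgeItem

    // Importance decay over time: linear for short-term memory,
    // logarithmic for long-term memory (see the chart above).
    // The rate constants here are placeholders.
    double decayedImportance(double importance, double hoursElapsed, bool longTerm)
    {
        if (longTerm)
            return importance - 0.01 * std::log(1.0 + hoursElapsed);  // slow
        return importance - 0.1 * hoursElapsed;                       // rapid
    }

    // Forget items when the memory is over capacity: only items whose
    // importance has fallen below the forgetfulness threshold are
    // eligible, and the least important eligible item goes first.
    void forget(std::map< ulong, Item >& knowledge, ulong capacity, double forgetfulness)
    {
        while (knowledge.size() > capacity) {
            std::map< ulong, Item >::iterator weakest = knowledge.end();
            for (std::map< ulong, Item >::iterator it = knowledge.begin();
                 it != knowledge.end(); ++it) {
                if (it->second.importance < forgetfulness &&
                    (weakest == knowledge.end() ||
                     it->second.importance < weakest->second.importance))
                    weakest = it;
            }
            if (weakest == knowledge.end())
                break;                      // nothing is forgettable yet
            knowledge.erase(weakest);       // this item is forgotten
        }
    }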

Remembering

The memory recall process has a probabilistic aspect. An important fact will nearly always be recalled whenever it might apply. An unimportant fact may or may not be recalled at any particular time, even though it has not been completely forgotten.

Simply recalling an item during the thought process enhances the item's importance in a minor way. Actually using the item to derive the answer to a question or to solve a problem enhances its importance to a much greater degree. The algorithm used to determine enhancement of an item's importance is probably different in long-term memory than it is in short-term memory.
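
A hedged sketch of how the probabilistic recall and the two levels of importance enhancement could be modeled. Treating importance as roughly a 0-to-1 scale, the sigmoid shape, and the specific enhancement amounts are all assumptions:

    #include <cmath>
    #include <cstdlib>

    // Chance that an item is recalled when it might apply: important facts
    // are nearly always recalled, unimportant ones only sometimes.
    double recallProbability(double importance)
    {
        return 1.0 / (1.0 + std::exp(-10.0 * (importance - 0.5)));
    }

    // Roll against that probability to decide whether recall happens now.
    bool tryRecall(double importance)
    {
        double roll = std::rand() / (double)RAND_MAX;
        return roll < recallProbability(importance);
    }

    // Minor enhancement for merely recalling an item, larger enhancement
    // for actually using it; the amounts (and how they would differ between
    // short-term and long-term memory) are placeholders.
    void enhanceOnRecall(double& importance) { importance += 0.01; }
    void enhanceOnUse(double& importance)    { importance += 0.10; }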

Implementation Classes

EntityMemory

EntityMemory is the class which represents the associative memory of an entity.

Data

longTermMemory AssociativeMemory
shortTermMemory AssociativeMemory
shortTermToLongTerm map< ulong, ulong >
  Maps short-term knowledge item id to long-term knowledge item id.
workingKnowledge (TBD)
  This is intended as the place where knowledge from short-term and long-term memory is combined; the playground of the reasoning engine.
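
A minimal C++ declaration of these members might look as follows; ulong is assumed to be unsigned long, the AssociativeMemory members are held by pointer only so the fragment stands alone, and workingKnowledge is omitted because it is still TBD:

    #include <map>

    typedef unsigned long ulong;     // assumed knowledge item/subject id type

    class AssociativeMemory;         // sketched in the next section

    // Associative memory of an entity: a short-term and a long-term store,
    // plus a map from short-term item ids to their long-term counterparts.
    class EntityMemory {
    public:
        AssociativeMemory* longTermMemory;
        AssociativeMemory* shortTermMemory;
        std::map< ulong, ulong > shortTermToLongTerm;
        // workingKnowledge: TBD -- the playground of the reasoning engine
    };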

AssociativeMemory

AssociativeMemory is the class which implements associative memory.

Data

associationOneElement map< ulong, list< ulong > >
  Maps a knowledge subject id to a list of knowledge item ids that reference the subject.
associationTwoElement map< ulong, map< ulong, list< ulong > > >
  Maps an ordered pair of subject ids to a list of knowledge item ids that reference the subjects in the required order.
capacity ulong
  Number of knowledge items which the memory can hold. Note that this is not a hardware or data representation limit, but a configuration parameter. For example, a senile person can be roughly modeled as having a perfectly normal long-term memory but a short-term memory with significantly smaller capacity than usual.
forgetfulness double
  The threshold of importance below which an item is susceptible to being forgotten.
knowledge map< ulong, KnowledgeItem >
  Where the non-associational part of knowledge is stored.
nextKnowledgeItem ulong
  Knowledge item id to be used for the next knowledge item.
nextKnowledgeSubject ulong
  Knowledge subject id to be used for the next knowledge subject.
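
A corresponding sketch of the AssociativeMemory members; the KnowledgeItem body is left as an empty placeholder here because its members are detailed in their own section below:

    #include <list>
    #include <map>

    typedef unsigned long ulong;

    struct KnowledgeItem { /* see the KnowledgeItem section below */ };

    class AssociativeMemory {
    public:
        // Subject id -> items that reference the subject.
        std::map< ulong, std::list< ulong > > associationOneElement;
        // Ordered pair of subject ids -> items referencing both, in order.
        std::map< ulong, std::map< ulong, std::list< ulong > > > associationTwoElement;
        ulong capacity;              // configured item limit, not a hardware limit
        double forgetfulness;        // importance threshold for forgettable items
        std::map< ulong, KnowledgeItem > knowledge;   // non-associational knowledge
        ulong nextKnowledgeItem;     // id to assign to the next knowledge item
        ulong nextKnowledgeSubject;  // id to assign to the next knowledge subject
    };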


KnowledgeItem

KnowledgeItem is the class which implements a fact, supposition, conclusion, et cetera.

Data

credibility double
importance double
source list< ulong >
  List of knowledge subject ids which are the ids of the sources of the knowledge.
subject vector< ulong >
  Compactly-ordered array of knowledge subject ids.  
timeLastAcquired datetime
timeLastUsed datetime
type enum
  Determines the significance of each of the subject positions and will influence how the knowledge item is used. For example, if the knowledge type is "Noun-Relationship-Noun," then the first subject is a noun which has the relationship indicated by the second subject with the noun indicated by the third subject.
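
A sketch of the KnowledgeItem members; the datetime stand-in, the enum constant names (taken from the Types of Knowledge section below), and the use of a plain struct are assumptions:

    #include <chrono>
    #include <list>
    #include <vector>

    typedef unsigned long ulong;
    typedef std::chrono::system_clock::time_point datetime;   // assumed stand-in

    // Knowledge type determines what each subject position means.
    enum KnowledgeType {
        NounRelationshipNoun,       // Bob-loves-Judy
        NounAttributeValue,         // Chris' car-color-white
        NounAttributeNumericUnit,   // My jug-capacity-2-liter
        UnitConversionUnit          // liters-0.2642-gallons(US liquid)
    };

    // A fact, supposition, conclusion, et cetera.
    struct KnowledgeItem {
        double credibility;
        double importance;
        std::list< ulong > source;     // subject ids of the knowledge's sources
        std::vector< ulong > subject;  // compactly-ordered subject ids
        datetime timeLastAcquired;
        datetime timeLastUsed;
        KnowledgeType type;
    };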

Types of Knowledge

Noun-Relationship-Noun

Bob-loves-Judy

Used to specify a relationship between two nouns. Note that relationships are not necessarily symmetric, but a relationship can typically be mapped to some inverse relationship. From "Bob-loves-Judy" we can infer that "Judy-is loved by-Bob" but not "Judy-loves-Bob," even though we may hope that this is the case.

Inverse relationships may be stored as Relationship-inverse-Relationship, as a particular instance of NRN. Example: "loves-inverse-is loved by." Because the inverse relationship is its own inverse, it makes sense to store the knowledge item "inverse-inverse-inverse"; i.e., if the inverse of "loves" is "is loved by," then the inverse of "is loved by" is "loves."
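
For illustration, applying a stored inverse relationship to an NRN item could look like this; the helper function and the subject ids mentioned in the comments are hypothetical:

    #include <vector>

    typedef unsigned long ulong;

    // Given an NRN subject vector [noun1, relationship, noun2] and the id of
    // the relationship's inverse (looked up from a
    // "relationship-inverse-relationship" item), build the inferred item
    // [noun2, inverse relationship, noun1].
    std::vector< ulong > invertNRN(const std::vector< ulong >& nrn, ulong inverseRel)
    {
        std::vector< ulong > inferred(3);
        inferred[0] = nrn[2];
        inferred[1] = inverseRel;
        inferred[2] = nrn[0];
        return inferred;
    }

    // With hypothetical ids BOB, LOVES, JUDY, IS_LOVED_BY, inverting
    // [BOB, LOVES, JUDY] with IS_LOVED_BY yields [JUDY, IS_LOVED_BY, BOB],
    // i.e. "Judy-is loved by-Bob" -- but never "Judy-loves-Bob".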

Noun-Attribute-Value

Chris' car-color-white

Used to assign a value for an attribute of a subject.

Noun-Attribute-Numeric value-Unit

My jug-capacity-2-liter

Form of NAV to be used when units are known or implied.

Unit-Conversion factor-Unit

liters-0.2642-gallons(US liquid)

Form of NRN to relate two value units. Because it is a relationship, it can be inverted by the reasoning engine if necessary; this is accomplished by taking the multiplicative inverse of the conversion factor: "gallons(US liquid)-3.785-liters." A suitably complex reasoning engine can also chain the relationships together; from "liters-0.2642-gallons(US liquid)" and "gallons(imperial)-1.20095-gallons(US liquid)" the engine could invert the second item to obtain "gallons(US liquid)-0.83267-gallons(imperial)," then combine the items to obtain "liters-0.22-gallons(imperial)." [More complex conversions, alas, cannot be handled in quite this way (e.g. RGB color to HSV or CIE chromaticity, Cartesian to polar coordinates, Fahrenheit to Centigrade).]
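
As a small worked sketch, inverting and chaining conversion factors; the function names are hypothetical, and the numbers are the ones used above:

    #include <cstdio>

    // Inverting a Unit-ConversionFactor-Unit item is just taking the
    // multiplicative inverse of the factor.
    double invertFactor(double factor) { return 1.0 / factor; }

    // Chaining two conversions (A->B and B->C) multiplies the factors.
    double chainFactors(double aToB, double bToC) { return aToB * bToC; }

    int main()
    {
        double litersToUsGal = 0.2642;    // liters-0.2642-gallons(US liquid)
        double impGalToUsGal = 1.20095;   // gallons(imperial)-1.20095-gallons(US liquid)

        double usGalToImpGal  = invertFactor(impGalToUsGal);                 // ~0.83267
        double litersToImpGal = chainFactors(litersToUsGal, usGalToImpGal);  // ~0.22

        std::printf("liters-%.4f-gallons(imperial)\n", litersToImpGal);
        return 0;
    }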

Using these concepts, the question "How many imperial gallons does my jug hold?" can be reformulated as a search for a knowledge item of the form "My jug-capacity-?-gallons(imperial)". In the absence of a direct match to the desired knowledge, the item can be constructed from the facts of the above examples, i.e.: "My jug-capacity-2-liter" combined with the derived "liters-0.22-gallons(imperial)" yields "My jug-capacity-0.44-gallons(imperial)".
