Advanced Engrams for a Thinking Computer (ISBN 9798734059920)

Engrams are the computer programs of the brain, written using the instruction set of the human brain described in Appendix A, The Computer Architecture of the Human Brain.
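As a rough, hypothetical sketch of that thesis (not the book's own code or notation), the snippet below models an engram as a short instruction sequence run by a toy hierarchical state machine with call and return, echoing the book's unification of instantiation/abstraction with call/return (section 1.7). Every class name and opcode here is an invented placeholder.

```python
# Hypothetical illustration only: an "engram" modeled as a short program
# over an invented brain-like instruction set, executed by a toy
# hierarchical state machine with CALL/RETURN semantics.

class Engram:
    """A named sequence of (opcode, argument) instructions."""
    def __init__(self, name, instructions):
        self.name = name
        self.instructions = instructions


class Machine:
    """Runs engrams; CALL descends into a sub-engram, and finishing a
    sub-engram implicitly returns to the caller."""
    def __init__(self, engrams):
        self.engrams = {e.name: e for e in engrams}
        self.stack = []  # return stack of (engram, resume index) pairs

    def run(self, name):
        engram, pc = self.engrams[name], 0
        while True:
            if pc >= len(engram.instructions):
                if not self.stack:
                    return                      # top-level engram done
                engram, pc = self.stack.pop()   # implicit RETURN
                continue
            op, arg = engram.instructions[pc]
            pc += 1
            if op == "CALL":                    # descend one hierarchy level
                self.stack.append((engram, pc))
                engram, pc = self.engrams[arg], 0
            elif op == "RECALL":                # produce a stored item
                print(f"{engram.name}: recall {arg}")
            elif op == "RECOGNIZE":            # match an incoming item
                print(f"{engram.name}: recognize {arg}")


# A two-level hierarchy: a sentence engram calls a noun-phrase engram.
machine = Machine([
    Engram("sentence", [("CALL", "noun_phrase"), ("RECALL", "verb")]),
    Engram("noun_phrase", [("RECOGNIZE", "article"), ("RECOGNIZE", "noun")]),
])
machine.run("sentence")
```

Running it prints the noun-phrase engram's two recognition steps, then the calling sentence engram's recall step, tracing one descent and return through the hierarchy.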

English · 463 pages · 2021

Table of contents:
PREFACE

INTRODUCTION

1   ABSTRACT IDEAS

1.1 What is an idea?

1.2 Understanding Abstraction and Instantiation

1.3 What is an abstract idea?

1.4 Finding abstract ideas using the abstract idea recognition engram

1.5 Understanding State machines

1.6 Understanding Subroutines: Calling and Returning

1.7 The unification of instantiation/abstraction with call/return

1.8 The main state machine of the brain: abstract idea based hierarchical state machine (the basis of intelligence)

1.9 Moving from abstract idea state to abstract idea state using the “move abstract idea state” engram

1.10 Abstract ideas used for stimulus/response

1.11 A parser constructed from an abstract word based state machine

1.12 The abstraction of abstract ideas

1.13 Using abstract ideas to answer questions

1.14 Vision and abstract ideas

1.15 Automatic abstract idea learning engram

1.16 No backwards abstraction

1.17 The Autostep State machine

1.18 Interrupts

1.19 Top level state machine

2   THE ABSTRACT IDEA RECOGNITION ENGRAM – UNDERSTANDING A SENTENCE OR SITUATION

2.1 Imagination and Understanding in a Thinking Computer

2.2 Main state machine abstract ideas

2.3 An abstract idea is a use of a data node

2.4 A sentence can activate multiple abstract ideas

2.5 Understanding sentences that are questions

2.6 Hierarchical sentence learning

2.7 Sequential aspects of phrases – combining phrases

2.8 Phrases and groups of phrases

2.9 Abstract ideas are used to check that a sentence makes sense

2.10 Blocking a question sentence from activating when looking for a statement

2.11 Blocking abstract idea recognition when looking for a sentence

2.12 Abstract ideas that activate when reading a children’s story

3  SENTENCE TO SENTENCE UNDERSTANDING AND GENERAL UNDERSTANDING USING THE ABSTRACT IDEA BASED HIERARCHICAL STATE MACHINE

3.1 Understanding relationships between pairs of sentences

3.2 Understanding sentence relationships between a group of sentences with hierarchical movement

3.3 Multiple abstract ideas linked to a sentence

3.4 Sentence to sentence understanding without matching subjects

3.5 Multiple abstract idea based hierarchical state machines

3.6 Skipping past unrelated sentences when looking for sentence to sentence understanding

3.7 The “get_history” engram

3.8 Standalone abstract ideas and groups of abstract ideas

3.9 Abstraction when looking for sentence to sentence understanding

3.10 Memory space reduction by using subject, verb, or object match

3.11 Multiple subject sentence to sentence understanding with independent situations

3.12 Multiple subject sentence to sentence understanding with a single situation

3.13 Sentence to sentence merging

3.14 Skipping steps and pushing and popping multiple levels in the hierarchy during sentence to sentence understanding

3.15 The input sentence can activate multiple abstract ideas, each associated with a different abstract idea from previous sentences

3.16 Abstract ideas and noun groups

3.17 Checking for abstract idea transitions from past sentences

3.18 Detecting commonality in sentence groups

3.19 Sentence groups – situations and stories

3.20 Engrams for inferring what we are doing

3.21 Abstract distance and following along

3.22 Switching from recognition to recall

3.23 Abstract ideas that process numbers

3.24 Abstract ideas that process location

3.24.1  Mental Maps

3.24.2  Skipping abstract idea steps using understanding of location

3.24.3  Imagining a visual image

3.24.4  Multiple wheres

3.25 Abstract ideas that process situation based facts

3.26 Review of abstract idea links

3.27 General understanding

3.28 Understanding of interrupts

4   A PARSER BUILT FROM ABSTRACT WORD STATES ARRANGED AS A HIERARCHICAL STATE MACHINE

4.1 Words, phrases, clauses, and sentences

4.2 Engrams for processing noun phrases (nouns, adjectives, intensifiers, pronouns, and articles)

4.2.1  The Processing of Nouns by the Human Brain – the noun1 engram

4.2.2  Processing Adjectives and Intensifiers

4.2.3  Processing intensifiers/adjectives with conjunctions and comparing adjectives

4.2.4  Adjective abstraction

4.2.5  Processing Pronouns – the pronoun1 engram

4.2.6  Processing nouns that act like pronouns

4.2.7  Processing Articles

4.2.8  Singular and plural with abstract ideas

4.2.9  One and many

4.2.10  Noun groups

4.2.11  Adjective groups (multiple adjectives)

4.2.12  Adjectives – why we need subj_adj, obj_adj, prep_adj, iobj_adj

4.2.13  The Parser fills in missing pieces of a multipart sentence

4.2.14  Ambiguous Nouns

4.3 Engrams for processing verb phrases (verbs and adverbs)

4.3.1  Helping verbs

4.3.2  Verb tense

4.3.3  Agreement between subjects and verbs

4.3.4  Processing Verbs – the verb1 engram

4.3.5  Verb infinitives

4.3.6  Prepositional phrase complements

4.3.7  Processing Adverbs

4.3.8  Verb abstraction

4.3.9  Phrasal verbs (multi word verbs)

4.3.10  Handling the adverbs “not”, “almost”, and other constraint adverbs

4.3.11  Tense ambiguity

4.4 Engrams for processing prepositions

4.4.1  Some prepositions can have multiple meanings

4.4.2  Fixing a sentence so the engram SUMALLATRS finds the correct answer

4.4.3  There can be many copies of the same relation with different destinations

4.4.4  Adjective noun phrase acting like a prepositional phrase

4.5  Engrams for processing participles

4.5.1  Participle noun pairs

4.6 Engrams for processing interrogatives

4.6.1  “Who” questions look for a person

4.6.2  Answers using instantiation data structures and sentence structures

4.6.3  Listing of the SUMALLATRS engram

4.6.4  Engrams using multiple strategies to answer questions

4.6.5  Engrams for handling “what else” questions

4.7 Engrams for processing sentence ending punctuation

4.8 Engrams for processing conjunctions

4.8.1  Processing three clause sentences

4.8.2  Processing conjunctions with the phrase “so are”

4.8.3  Processing if-then sentences

4.8.4  Answering an adjective question that contains a conjunction

4.8.5  Answering a question with a conjunction

4.9 Engrams for processing numbers

4.10 Processing sentences with the word “that” and sentences with implied “that”

4.11 Ambiguous words

4.11.1  Backtracking and reparsing

4.11.2  Handling ambiguous words with a separate parser state for the ambiguous word

4.11.3  Word sequences with ambiguous meanings

4.12  Engram for biasing related words – word groups

4.13  Naming all instances of an abstraction

4.14  Recall and Recognition of word sequences

4.15 Speaking

4.16 Silent speaking

4.17 Words, Vision, and Understanding

4.18 Sequence of engrams used to understand sentences and answer questions

5   LEARNING ABSTRACT IDEAS

5.1 Learning a new abstract idea by converting a sentence to an abstract idea

5.2 Learning a hierarchical sequence of abstract ideas: Automatic abstract idea learning at execute

5.3 Learning abstract ideas by following along

5.4 Engrams that fix engrams

5.5 Learning how to answer a question

5.6 Abstract ideas that fix other abstract ideas

5.7 Disabling an abstract idea

5.8 Correcting incorrect abstract ideas

6   THE AUTOSTEP ABSTRACT IDEA BASED HIERARCHICAL STATE MACHINE

6.1  The fundamental difference between understanding and work

6.2  The autostep state machine

6.3  The autostep state machine handling of interrupts

6.4  Learning the steps of an autostep sequence

6.5  Plans and Goals – deciding what to do

6.6  Conversational abstract ideas

6.7  Fast abstract idea movement using the autostep engram

6.8 Autostep state machine control of reading

6.9 Interaction between understanding and autostep

7   THE TOP LEVEL STATE MACHINE

7.1  Top level state machine handling of emotions

7.2 Top level abstract idea state movement

8    CHECKS

8.1 Abstract ideas perform checks

8.2 Abstract ideas check the combination of subject, verb, and object

8.3 Learning new checks

9   UNDERSTANDING THE ABSTRACT IDEA STATE MACHINE, THOUGHTS, AND SITUATIONS

9.1 Two abstract ideas are required to handle a bidirectional association

9.2 Multiple “isa” links

9.3 Detecting a match

9.4 Noticing and Explaining

9.5 Expanding and contracting understanding

9.6 Nested complex facts

9.7 The nature of consciousness and intelligence

9.8 Engrams for processing goals and generating plans

9.9 Engrams for figuring out situations, answers, and things

9.10 Review of entry points for key engrams in engram.txt

10   REGRESSION

11   EXPERIMENTS DEMONSTRATING THE OPERATION OF ADVANCED ENGRAMS FOR A THINKING COMPUTER

11.1  SOFTWARE INSTALLATION

11.2  RUNNING THE ADVANCED MAIN ENGRAM EXPERIMENTS

11.3 PROGRAM OPERATION

11.4 FINAL COMMENTS

12   CONCLUSION

APPENDIX A:  THE COMPUTER ARCHITECTURE OF THE HUMAN BRAIN

A1.1   Recall

A1.2   Recognition

A1.3   Learning

A1.4   The Control Section of the Brain

A1.5   Parsing Sentences

A1.6  Instantiation

A1.7  Input and Output

A1.8   The Basic Instruction Set of the Human Brain

A1.9   Unification of the Data Section Structure and the Control Section Structure of a Traditional Computer

A1.10   The Emotion Section

A1.11
