Examples

Examples of Functional Cognitive Architecture.

Failure Example

Note: this happened in the past and is recreated from memory – I think it is a good example of how my lack of Social Salience, combined with a purely logical approach, can fail.

This example shows how my cognitive architecture is very different from what an NT person experiences – and, in this case, not in a good way.

I received a call late one night; the caller was very upset (crying) and had a question about programming.

I could hear crying and identify it as a high-magnitude signal for being upset. I had also been given a question that I knew how to answer. I don’t recall if I asked why the caller was upset (in any event I didn’t get an answer). Since I didn’t have any sadness to debug, my Manual Frame Construction went for:

question about software, rule: software expert, provide answer

Functional Logic Modeling quickly provided the shortcut for answering programming questions: I looked up the information in my memory (facts), and my propositional logic delivered the answer.

To an NT person this probably sounds very cold, or even cruel. I don’t have affective empathy, so I can’t pick up the signal that is very obvious to most people. This was a non-sequitur in my mind – two pieces of incongruent information – and I picked the one that I could easily deal with. The upset fact was discarded as irrelevant. I don’t intend to be cold, but it is a fact of having no Social Salience.

Note: Signal Recognition is not Signal Integration.

While I recognized the crying as a signal of distress, it remained Inert Data. In my cognitive architecture, a signal must have a corresponding Functional Script to be actionable. I lacked the “Social Co-regulation” (note: I discovered this was a thing while writing these notes) script necessary to address emotional distress; therefore, the signal was a known fact with no logical “If/Then” destination.

I proceeded with the programming query because it was the only data point for which I possessed a Functional Logic Model. My response was an optimization of the only variable I could compute (the technical question), not a rejection of the person’s emotional state. The “Distress” was a recognized variable that the system simply lacked the software to process.
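The “recognized but unscripted” distinction can be sketched as a dispatch table. This is a hypothetical illustration (the names and structure are mine, not from the original notes): a signal without a matching Functional Script is recognized, then discarded, exactly as described above.

```python
# Hypothetical sketch of "Signal Recognition is not Signal Integration".
# A signal can be recognized yet stay Inert Data when no Functional
# Script maps it to an action. All names here are illustrative.

scripts = {
    # The only If/Then destination available in this situation:
    "programming_question": lambda: "look up facts, deliver answer",
    # Deliberately no entry for "distress" -- the signal is recognized
    # but has no script, so it cannot become an action.
}

def respond(recognized_signals):
    """Return actions only for signals that have a Functional Script."""
    actions = []
    for signal in recognized_signals:
        script = scripts.get(signal)
        if script is not None:
            actions.append(script())
        # Signals with no script fall through: recognized, then discarded.
    return actions

print(respond(["distress", "programming_question"]))
```

Only the programming question produces an action; the distress signal is present in the input but has no destination.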

Typical Example

I made notes about this as soon as I got home – it was still fresh in my mind and I was paying attention to my thought process. I think this is a typical interaction: low stakes, but not no stakes.

I was driving home from work when I got a call from my wife, who was also driving home from somewhere else. She asked, “I need to go to the store and get some bread. Do you want to meet up and go with me and get food while you’re out?”

Enter Manual Frame Construction. What is going on? I assembled this while she was talking:

  • go to store
  • meet up and eat out

Pretty simple, and roughly complete in terms of framing the situation. I used a low-fidelity Functional Logic Model to compute a plan:

  • does she need me to go to the store with her?
    • she asked
    • and offered food – a sure enticement
    • but I think no, just being nice
  • food?
  • I want to eat out, always
    • but I do have some food at home that I was thinking of
  • logistics
    • complicated, probably requires waiting
    • going to store – not my favorite
  • not required, want food but effort exceeds reward and food at home is ok

Execution stage:

The logic looks something like this (via introspection – asking myself, why did I say that?):

IF decision no THEN polite decline
IF low likelihood no THEN give alternative plan

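These two rules can be sketched in code. This is a hedged reading, not the author’s actual process: the function name is mine, and I am interpreting “low likelihood no” as a decline that is not firm, which is why an alternative plan gets attached to soften it.

```python
# Hedged sketch of the two execution rules above. "low_likelihood_no"
# is my reading of "IF low likelihood no": the decline is not firm,
# so an alternative plan is offered alongside it.
def execution_stage(decision, low_likelihood_no=False):
    outputs = []
    if decision == "no":
        outputs.append("polite decline")
    if decision == "no" and low_likelihood_no:
        outputs.append("give alternative plan")
    return outputs

print(execution_stage("no", low_likelihood_no=True))
```

With both rules firing, the output is a polite decline plus an alternative plan – which matches the actual reply: “No thanks” (decline) plus “I will eat at home” (alternative plan).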
I responded, “No thanks, I will eat at home,” and as I was saying it I delivered new information to myself – the feedback loop. Maybe I needed to justify my decision to avoid questions or argument.

Back to Functional Logic Model:

  • questions/argument possible, justify
    • other reasons to be at home?
    • unknown arrival time if go to store
    • dog needs food
    • dog needs walking

Execution Stage again:

IF value(reason) > utility(declined event) THEN state

and added on, “I will feed the dog and walk him.” This was done more or less without pause as I was speaking. I had to think it manually, but it wasn’t complex, and I could add it immediately. Was there a pause? Not that I am aware of – probably not more than a few commas’ worth.
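The justification rule could be sketched like this. Only the comparison – state a reason when its value exceeds the utility of the declined event – is from the notes; the numeric scores and the function name are invented for illustration.

```python
# Hedged sketch of: IF value(reason) > utility(declined event) THEN state.
# All numeric scores here are invented; only the comparison rule itself
# comes from the notes above.
def reasons_to_state(reason_values, declined_event_utility):
    """Keep only reasons worth more than the event being declined."""
    return [reason for reason, value in reason_values.items()
            if value > declined_event_utility]

reason_values = {
    "unknown arrival time if I go to the store": 1,
    "dog needs food": 3,
    "dog needs walking": 3,
}
print(reasons_to_state(reason_values, declined_event_utility=2))
```

With these made-up scores, only the two dog-related reasons clear the bar, which is roughly what got stated aloud.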

Note: if I had not thought of providing a reason why, and was asked, I would probably say “I don’t want to”. This is also fine to me, but I suspect it is more likely to cause Friction than giving a reasonable reason. This would be a case where I didn’t have the model ready but had to provide a decision/opinion.