Argumentation and Realization in AI Master Programmes

University of Cyprus
Autumn 2022
Master programmes in Artificial Intelligence 4 Careers in Europe

COGNITIVE PROGRAMMING FOR HUMAN-CENTRIC AI
Antonis Kakas

Lecture 1: Structured Argumentation
1. Realizations of Computational Argumentation
Reminder: Argumentation Process

<Args, ATT> or <Args, Att, Def>

Step 1: Construction of Arguments, i.e. construction of Args.
Step 2: Evaluation of Arguments: acceptability/validity of argument sets.
Construction of Arguments

What is an argument?
An argument is a LINK between two pieces of information: the premises and the position (or claim) of the argument.
a1 = (bird; fly)
A Link, not a Rule!
Construction of Arguments

Arguments are constructed as instantiations of argument schemes:
As = (Premises; Position)
Argument schemes are either "programmed" or learned from data analysis or experience.
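The scheme/instantiation idea above can be sketched in code. This is a minimal illustration of my own (names like `Argument` and `instantiate` are not from the lecture's software): an argument is simply a link from a set of premises to a position.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Argument:
    premises: frozenset  # the supporting pieces of information
    position: str        # the claim the premises are linked to

def instantiate(premises, position):
    """Instantiate an argument scheme As = (Premises; Position)."""
    return Argument(frozenset(premises), position)

# The slide's example: a1 = (bird; fly) -- a link, not a rule.
a1 = instantiate({"bird"}, "fly")
```

Note that nothing here is a logical rule: the object merely records that the premises are linked to the position.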
Realization of Argumentation

<Args, ATT> or <Args, Att, Def>

A realization, or a structured argumentation framework, of an argumentation framework is:
<AS, Cf, St>
AS is a set of argument schemes.
Cf is a conflict relation on the statements.
St is a strength/preference relation on AS.
Realization of Argumentation

<As, C, ≥> (where ≥ = St)
As is a set of argument schemes.
C is a conflict relation (in the language).
≥ is a binary strength relation on As.

Realization of Argumentation

As is used to construct arguments.
C is used to specify counter-arguments.
≥ is used for arguments to defend themselves.

Realization of Argumentation

Given <AS, Cf, St> we construct/realize an argumentation framework <Args, ATT> or <Args, Att, Def>:
Args are instantiations of elements of AS.
a1 attacks a2, i.e. (a1, a2) ∈ Att, if they are in conflict according to Cf.
a1 defends against a2, i.e. (a1, a2) ∈ Def, if a1 is not weaker than a2 under St; in this case, also (a1, a2) ∈ ATT.
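These definitions can be turned into a small executable sketch. This is my own toy encoding, not the lecture's system: `Att` collects pairs of arguments whose claims conflict under Cf, and `Def` keeps the attacks whose source is not weaker under St (here an assumed numeric strength).

```python
# Toy conflict relation Cf: two claims conflict when one is "not " + the other.
def cf(claim1, claim2):
    return claim1 == "not " + claim2 or claim2 == "not " + claim1

# Arguments as name -> claim; St as an (assumed) numeric strength per argument.
claims = {"a1": "light_on", "a2": "not light_on"}
st = {"a1": 1, "a2": 2}

# (x, y) in Att iff their claims are in conflict according to Cf.
Att = [(x, y) for x in claims for y in claims
       if x != y and cf(claims[x], claims[y])]

# (x, y) in Def iff x attacks y and x is not weaker than y under St.
Def = [(x, y) for (x, y) in Att if st[x] >= st[y]]
```

With these toy strengths, a1 and a2 attack each other, but only a2 (the stronger one) can defend.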
Realization of Argumentation
From the philosophical roots of argumentation. Given <AS, Cf, St>, "a1 attacks a2" means a1 and a2 are in conflict under Cf, and the attack is named:
Rebuttal if the positions of a1 and a2 conflict.
Undermine if a1 conflicts with the premises of a2.
Undercut if the conflict is between the argument schemes of a1 and a2.
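The three attack names can be distinguished mechanically. A hedged sketch in my own encoding (an argument here is a dict with `premises`, `position`, and `scheme` fields; the lecture does not prescribe this representation, and the toy `conflict` test only recognises "not"-prefixed negation):

```python
def conflict(s1, s2):
    """Toy Cf: two statements conflict when one negates the other."""
    return s1 == "not " + s2 or s2 == "not " + s1

def attack_type(a1, a2):
    """Name the attack of a1 on a2, following the slide's three cases."""
    if conflict(a1["position"], a2["position"]):
        return "rebuttal"        # conflicting positions
    if any(conflict(a1["position"], p) for p in a2["premises"]):
        return "undermine"       # a1 conflicts a premise of a2
    if conflict(a1["scheme"], a2["scheme"]):
        return "undercut"        # conflicting argument schemes
    return None

# Illustrative arguments (hypothetical encoding of the light example):
a1 = {"premises": {"switch_on"}, "position": "light_on", "scheme": "causes"}
a2 = {"premises": {"light_on"}, "position": "no_darkness", "scheme": "causes"}
a3 = {"premises": {"power_cut"}, "position": "not light_on", "scheme": "implies"}
```

Here a3 rebuts a1 (opposite positions) and undermines a2 (it conflicts a2's premise).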
Example of Realizing Argumentation
(See earlier lecture)

The power cut had turned the house into darkness. Bob came home and turned on the light switch.

Args = {a1, a2, a3}, constructed by common sense schemes:
a1 = {turn_on_switch causes light_on, light_on causes ¬darkness} ∪ {turn_on_switch@T}
a2 = {power_cut causes ¬electricity, ¬electricity implies ¬light_on} ∪ {power_cut@T}
a3 = {darkness@T implies darkness@T+} ∪ {darkness@T}

Argument schemes here are given names: "causes" and "implies".
a1 supports ¬darkness@T+; a3 supports darkness@T+.
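Encoded as data, the example's conflict shows up as a rebuttal between a1 and a3 over darkness@T+. A sketch in a notation of my own (negation written as a "-" prefix, and each argument as an illustrative triple of scheme instances, given facts, and supported claim; the negated claims follow my reading of the slide):

```python
# Each argument: (scheme instances, given facts, supported claim).
a1 = ({"turn_on_switch causes light_on", "light_on causes -darkness"},
      {"turn_on_switch@T"}, "-darkness@T+")
a3 = ({"darkness@T implies darkness@T+"}, {"darkness@T"}, "darkness@T+")

def negate(literal):
    """Flip the '-' prefix used here for negation."""
    return literal[1:] if literal.startswith("-") else "-" + literal

def rebut(x, y):
    """Rebuttal: the two arguments support conflicting claims."""
    return x[2] == negate(y[2])
```

a1 and a3 rebut each other on darkness@T+, which is exactly the conflict the evaluation step must then resolve.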
Another Example (from Cognitive Science)

Byrne's (1989) Suppression Task
Suppression Task (Byrne, 1989)

The factual information given along with the conditional(s) in each of the groups can change:
She has an essay to finish / She does not have an essay to finish
She has studied late in the library / She did not study late in the library
Byrne's (1989) Suppression Task: She has an essay to finish

If she has an essay to finish, then she will study late in the library.
She has an essay to finish.

What follows?
1. She will study late in the library
2. She will not study late in the library
3. She may or may not study late in the library

96%: Modus Ponens/Deduction.
Byrne's (1989) Suppression Task: She has an essay to finish

If she has an essay to finish, then she will study late in the library.
If she has a textbook to read, then she will study late in the library.
She has an essay to finish.

What follows?
1. She will study late in the library
2. She will not study late in the library
3. She may or may not study late in the library

96%: Modus Ponens/Deduction is not affected.
Byrne's (1989) Suppression Task: She has an essay to finish

If she has an essay to finish, then she will study late in the library.
If the library is open, then she will study late in the library.
She has an essay to finish.

What follows?
1. She will study late in the library
2. She will not study late in the library
3. She may or may not study late in the library
Byrne's (1989) Suppression Task: She has an essay to finish

If she has an essay to finish, then she will study late in the library.
If the library is open, then she will study late in the library.
She has an essay to finish.

What follows?
1. She will study late in the library
2. She will not study late in the library
3. She may or may not study late in the library

38%: Humans seem to suppress previously drawn information. They reason non-monotonically!
Byrne's (1989) Suppression Task in Argumentation

FORMALIZATION OF THE HUMAN REASONING IN ARGUMENTATION

GROUP 1:
If she has an essay to finish, then she will study late in the library.
She has an essay to finish.

a1: HasEssay → StudyLibrary
a1 supports StudyLibrary (when given she has an essay).
Byrne's (1989) Suppression Task in Argumentation

FORMALIZATION OF THE HUMAN REASONING IN ARGUMENTATION

GROUP 2:
If she has an essay to finish, then she will study late in the library.
If she has a textbook to read, then she will study late in the library.
She has an essay to finish.

a1: HasEssay → StudyLibrary
a2: HasTextBook → StudyLibrary
h_a3: {} → HasTextBook

a1 supports StudyLibrary.
a2 does not support its possible claim.
a2' = {a2, h_a3} supports StudyLibrary.
But no attacks (no conflicts)!
Byrne's (1989) Suppression Task in Argumentation

FORMALIZATION OF THE HUMAN REASONING IN ARGUMENTATION

GROUP 3:
If she has an essay to finish, then she will study late in the library.
If the library is open, then she will study late in the library.
She has an essay to finish.

a1: HasEssay → StudyLibrary
a2: OpenLibrary → StudyLibrary
a3: not OpenLibrary → not StudyLibrary
h_a4: {} → not OpenLibrary

a5 = {h_a4, a3} is an acceptable argument supporting not StudyLibrary.
a5 attacks a1 but not vice versa!
h_a6: {} → OpenLibrary
{a1, h_a6} is an acceptable argument for StudyLibrary.
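The Group 3 evaluation can be simulated with a toy acceptability check. This is a sketch under assumptions of mine: the attack and defence pairs below encode what the slide states (a5 attacks the bare a1 one-way, while the extended argument {a1, h_a6} and a5 each defend against the other, since neither hypothesis is stronger).

```python
ATT = {("a5", "a1"), ("a5", "a1+h_a6"), ("a1+h_a6", "a5")}
# Defences: h_a6 (OpenLibrary) counters h_a4 inside a5 and vice versa;
# the bare a1 has no defence against a5.
DEF = {("a1+h_a6", "a5"), ("a5", "a1+h_a6")}

def acceptable(arg):
    """An argument is acceptable here if it defends against each attacker."""
    return all((arg, attacker) in DEF
               for (attacker, target) in ATT if target == arg)

supports = {"a1": "StudyLibrary",
            "a5": "not StudyLibrary",
            "a1+h_a6": "StudyLibrary"}

claims = {supports[a] for a in supports if acceptable(a)}
```

Acceptable arguments exist for both StudyLibrary and not StudyLibrary, which matches the dominant human answer: she may or may not study late in the library.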
NL Comprehension

Text (Story) Comprehension:
http://cognition-srv1.ouc.ac.cy/~adamos.koumis/star.html
http://cognition-srv1.ouc.ac.cy/~adamos.koumis/index.html
PART 3

COMPUTATIONAL ARGUMENTATION in PRACTICE
Applications as Argumentation-based Decision Making

Decision of O (or derive conclusion φ):
Argument for O (or φ).
No argument for another O' (or ¬φ).
Through "Good Quality" arguments, i.e. acceptable arguments.
Practical Application of Argumentation

Populate a realization <AS, C, St>:
Argument/knowledge engineering/acquisition.
Consider computational heuristics in the dialectic argumentation process:
Cognitively based (sometimes).
Populate <AS, C, St>

The challenge is to capture the contextual Strength/Preference relation St:
St is not global; it is context dependent.
Hence we need to decide on the strength while deciding on the Option to choose: two intertwined decisions.
Arguing about Options reduces to arguing about the strength of the arguments supporting the Options.
Decision Making in Argumentation

Knowledge (SBPs) for Decision Making.
General, cognitive form of knowledge:
"Generally, in SITUATION prefer Ois, but when in particular CONTEXT, prefer Ojs."
Example: generally, deny calls when {busy at work}, but allow calls from {collaborators}.
Scenario-based Preferences:
<Id, Scenario_Conditions, Preferred_Options>
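SBPs drop naturally into tuples. A minimal sketch of my own encoding, using the phone-call example above (identifiers like `applicable` are illustrative):

```python
# <Id, Scenario_Conditions, Preferred_Options> as plain Python tuples.
SBPS = [
    ("s1", {"busy_at_work"}, {"deny_call"}),                        # general
    ("s2", {"busy_at_work", "from_collaborator"}, {"allow_call"}),  # context
]

def applicable(sbps, situation):
    """The SBPs whose scenario conditions all hold in the current situation."""
    return [s for s in sbps if s[1] <= situation]

hits = applicable(SBPS, {"busy_at_work", "from_collaborator"})
```

For a call from a collaborator while busy, both SBPs apply; deciding between them is exactly the refinement problem treated below.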
Representation Language/Process (Study Assistant Example)

Separate the Options and Scenario languages.
Options: Study at Library, Home, Café.
Capture hierarchies of Scenario-based Preferences amongst the Options:
<1, {Homework}, {Home, Cafe}>
<2, {Homework, Late}, {Home}>
<3, {Homework, Need_Sources}, {Library}>
Capture anti-preferences (αντενδείξεις, or contra-indications) for an individual Option:
<a1, {Closed_Library}, {-Library}>
Refinement & Combinations of Scenario-based Prefs

Refinement of scenarios with extra condition(s). Example 1:
<1, {Homework}, {Home, Cafe}>
<2, {Homework, Late}, {Home}>
Preferred options (e.g. Home) in the more specific scenario win. Therefore arguments in the more specific scenario are stronger:
Home is preferred over Café (and over Library).
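The "more specific scenario wins" rule can be sketched directly. Illustrative code of mine, measuring specificity by the number of scenario conditions (an assumption consistent with Example 1, where the refined scenario strictly extends the general one):

```python
SBPS = [
    (1, {"Homework"}, {"Home", "Cafe"}),
    (2, {"Homework", "Late"}, {"Home"}),
    (3, {"Homework", "Need_Sources"}, {"Library"}),
]

def preferred(sbps, situation):
    """Union of options from the applicable SBPs with the most specific scenario."""
    hits = [s for s in sbps if s[1] <= situation]
    if not hits:
        return set()
    best = max(len(s[1]) for s in hits)
    return set().union(*(s[2] for s in hits if len(s[1]) == best))

# Refinement: adding Late makes SBP 2 beat SBP 1, so Home wins outright.
```

On a combined scenario such as {Homework, Late, Need_Sources}, SBPs 2 and 3 tie on specificity and the rule returns {Home, Library}: the "no preference learned yet" outcome, unless a combined SBP is specified.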
Refinement & Combinations of Scenario-based Prefs

Combination of scenarios with conflicting options. Example 2:
<2, {Homework, Late}, {Home}>
<3, {Homework, Need_Sources}, {Library}>
<2|3, {Homework, Late, Need_Sources}, ???>
In combined scenarios the Preferred Options are specified independently (or via common sense), e.g. {Library}.
But {Home, Library} is also possible, i.e. no preference / do not know / have not learned this yet!
Exercise

Consider your own Personal Study Assistant. The Assistant needs to figure out where we will be studying/working today!
Express your preferences amongst the three options of Library, Café, and Home in the form of Scenario-based Preferences.
 
This Master is run under the context of Action No 2020-EU-IA-0087, co-financed by the EU CEF Telecom under GA nr. INEA/CEF/ICT/A2020/2267423.
