Generative AI and Language Understanding: Part 4

Sandiway Fong
University of Arizona
This is the era of Generative AI
https://chat.openai.com/chat
https://bard.google.com
Tough movement

There's a difference between the following two sentences with respect to the interpretation of arguments:
John is easy to please
John is eager to please
Do you see it?
Hint: what are the arguments of the predicate please?
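For concreteness, the contrast can be written out as predicate-argument structures. The sketch below is illustrative only; the role assignments encode the intended readings of the two sentences, not output from any parser:

```python
# Understood predicate-argument structures for the two sentences.
# In "John is easy to please", John is the OBJECT of please
# (it is easy for someone to please John); in "John is eager to
# please", John is the SUBJECT of please (John pleases someone).
structures = {
    "John is easy to please": {
        "predicate": "please",
        "subject": None,    # arbitrary understood pleaser
        "object": "John",   # John is the one being pleased
    },
    "John is eager to please": {
        "predicate": "please",
        "subject": "John",  # John is the one doing the pleasing
        "object": None,     # arbitrary understood person pleased
    },
}

for sentence, args in structures.items():
    role = "object" if args["object"] == "John" else "subject"
    print(f"{sentence}: John = {role} of {args['predicate']}")
```

The point of the exercise is that the two sentences are string-identical except for the adjective, yet John fills a different argument slot of please in each.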
Tough movement
 
Let's test ChatGPT on this. 
https://chat.openai.com/chat
We'll return to test DNN-based parsers later.
Google Bard
win-win!
Tough movement

Google Bard (screenshot slides)

A bit suspicious: why cite the 1st sentence?

No prior context.
Tough movement

(Roberts 2019) p. 17:
CP follows the head A (rather than preceding it, as in a head-final language);
English has infinitives, and indeed infinitives of this type;
arbitrary null pronouns can appear in this context with the properties that we observe them to have;
the trace is a wh-trace (in many languages, including all the Romance languages, this construction features an A′-dependency), etc.

easy (different from eager) has no external argument, e.g. It is easy to please John (*It is eager to please John)
 
On Wh-movement (Chomsky, 1977)
 
Berkeley Neural Parser
 
https://parser.kitaev.io
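The Berkeley Neural Parser returns Penn-Treebank-style constituency bracketings. As a minimal sketch, the string below is a plausible hand-written bracketing for the first sentence (an assumption for illustration, not actual parser.kitaev.io output), with a small helper that pulls out the constituent labels:

```python
import re

# Hand-written Penn-Treebank-style bracketing for "John is easy to please".
# This is an assumption for illustration, not actual parser.kitaev.io output.
bracketing = "(S (NP John) (VP is (ADJP easy (S (VP to (VP please))))))"

def constituent_labels(parse: str) -> list:
    """Return the phrase labels of the bracketing in left-to-right order."""
    return re.findall(r"\((\S+)", parse)

print(constituent_labels(bracketing))
# → ['S', 'NP', 'VP', 'ADJP', 'S', 'VP', 'VP']
```

Note that the constituency tree alone does not say whether John is the subject or object of please; that is exactly the information the tough-movement analysis supplies.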
Google Natural Language

subj_xcomp

Representation has a missing dependency: some dependencies are not explicitly computed, e.g. xcomp.
Parse is wrong anyway: see why?
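The missing dependency can be sketched as a toy graph. The edges below follow a typical UD-style analysis of "John is easy to please" (assumed for illustration, not actual Google Natural Language output); note that no edge makes John an argument of please, even though that is the understood reading:

```python
# Typical UD-style dependency edges for "John is easy to please"
# (assumed for illustration; not actual Google Natural Language output).
# Each edge is (head, relation, dependent).
edges = [
    ("easy",   "nsubj", "John"),    # John attached to easy, not please
    ("easy",   "cop",   "is"),
    ("please", "mark",  "to"),
    ("easy",   "xcomp", "please"),  # clausal complement of easy
]

# The intended reading needs John as the object of please,
# but no such edge exists in the representation:
please_deps = {(rel, dep) for head, rel, dep in edges if head == "please"}
print(("obj", "John") in please_deps)  # → False: the link is missing
```

A surface dependency parse of this shape leaves John's relation to please implicit, which is exactly the gap the slide points to.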
Uploaded on Feb 26, 2025


