Sarcasm means expressing the opposite of what we actually feel. It can also be defined as satirical wit intended to insult, mock, or amuse, and it must be detected and handled during natural language processing. On Twitter, we observe that many sarcastic tweets share a common structure that creates a positive/negative contrast between a sentiment and a situation. Specifically, sarcastic tweets often express a positive sentiment in reference to a negative activity or state. Consider the tweets below, where each positive sentiment term (e.g., happy, love) is paired with a negative activity or state (e.g., denied my payment, being ignored).
(a) Wow! I feel happy when he denied my payment.
(b) Oh how I love being ignored.
(c) Absolutely adore it when my bus is late.
(d) I'm so pleased mom woke me up by vacuuming my room this morning.
The sarcasm in these tweets arises when a positive sentiment word (e.g., happy, love, adore, pleased) is contrasted with a negative activity or state (e.g., denied my payment, being ignored, bus is late, woke me up).
The goal is to identify sarcasm that arises from the contrast between a positive sentiment and a negative situation. A key issue is to automatically recognize stereotypically negative "situations": activities or states that most people consider unenjoyable or undesirable. Because these situations are widely understood to be negative, they are rarely accompanied by explicit negative sentiment words. For example, "I feel sick" is universally understood to describe a negative situation. Phrases corresponding to such negative situations must therefore be learned.
Bootstrapping Algorithm
A bootstrapping algorithm automatically learns phrases for positive sentiments and negative situations. The aim of the algorithm is to produce a sarcasm classifier for tweets that recognizes contexts in which a positive sentiment is contrasted with a negative situation.
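Below is a minimal Python sketch of the bootstrapping loop, assuming a toy in-memory corpus of #sarcasm tweets and the single seed word "love". The candidate selection and POS filtering steps described in the following sections are omitted here, and multiword sentiment matching is simplified away.

```python
def ngrams_after(tokens, i, max_n=3):
    """1- to max_n-grams starting immediately after position i."""
    return {" ".join(tokens[i + 1:i + 1 + n])
            for n in range(1, max_n + 1) if i + 1 + n <= len(tokens)}

def ngrams_before(tokens, i, max_n=2):
    """1- to max_n-grams ending immediately before position i."""
    return {" ".join(tokens[i - n:i])
            for n in range(1, max_n + 1) if i - n >= 0}

def bootstrap(tweets, iterations=2):
    pos_phrases = {"love"}        # single seed sentiment word
    neg_situations = set()
    for _ in range(iterations):
        for tweet in tweets:
            tokens = tweet.lower().replace("#sarcasm", "").split()
            # Step 1: n-grams right after a known positive sentiment
            # word become candidate negative situation phrases.
            # (Only unigram sentiment matching, for brevity.)
            for i, tok in enumerate(tokens):
                if tok in pos_phrases:
                    neg_situations |= ngrams_after(tokens, i)
            # Step 2: n-grams right before a known negative situation
            # phrase become candidate positive sentiment phrases.
            for i in range(len(tokens)):
                rest = " ".join(tokens[i:])
                if any(rest.startswith(s) for s in neg_situations):
                    pos_phrases |= ngrams_before(tokens, i)
    return pos_phrases, neg_situations

tweets = ["I am very happy when he denied my payment #sarcasm",
          "Oh how I love being ignored #sarcasm"]
pos, neg = bootstrap(tweets)
print(pos)   # grows beyond the seed: {'love', 'i love'}
print(neg)   # {'being', 'being ignored'}
```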
Learning Negative Situation Phrases
The initial phase of the bootstrapping method learns new phrases that correspond to negative situations. The learning process consists of two steps: (1) harvesting candidate phrases and (2) selecting the best candidates. To collect candidate negative situation phrases, we extract n-grams that follow a positive sentiment phrase in a sarcastic tweet: every 1-gram, 2-gram, and 3-gram that occurs immediately to the right of the positive sentiment phrase.
I am very happy when he denied my payment #sarcasm
In the statement above, "happy" is the positive sentiment phrase. From this example, three candidate negative situation n-grams are extracted: "when", "when he", and "when he denied" (see the snippet below). Then, based on part-of-speech (POS) tags, the candidate list is filtered to keep only the n-grams with the intended syntactic structure.
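As a quick illustration, the harvesting step on this tweet reduces to a few lines of Python (hashtag stripped, lowercase tokens):

```python
# Candidate n-grams that immediately follow the positive
# sentiment word "happy" in the example tweet.
tokens = "i am very happy when he denied my payment".split()
i = tokens.index("happy")
candidates = [" ".join(tokens[i + 1:i + 1 + n]) for n in (1, 2, 3)]
print(candidates)  # ['when', 'when he', 'when he denied']
```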
For negative situations, the goal is to learn verb phrases (VPs) and complements that are themselves verb phrases. So we require a candidate phrase either to be a 1-gram tagged as a verb (V) or to match one of 9 POS-based bigram patterns created to approximate the recognition of verbal complement structures.
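A minimal sketch of such a POS filter, using NLTK's off-the-shelf tagger; the bigram patterns below are illustrative stand-ins for the nine patterns mentioned above, not the exact list, and trigram handling is omitted:

```python
import nltk  # assumes nltk data: punkt, averaged_perceptron_tagger

# Illustrative verbal-complement bigram patterns over Penn Treebank
# tag prefixes (stand-ins, not the exact nine patterns).
BIGRAM_PATTERNS = {("VB", "VB"), ("VB", "RB"), ("RB", "VB"),
                   ("TO", "VB"), ("VB", "NN")}

def keep_candidate(phrase):
    """Keep a 1-gram only if it is tagged as a verb; keep a 2-gram
    only if its tag sequence matches one of the bigram patterns."""
    tags = [tag for _, tag in nltk.pos_tag(nltk.word_tokenize(phrase))]
    if len(tags) == 1:
        return tags[0].startswith("VB")
    if len(tags) == 2:
        return any(tags[0].startswith(a) and tags[1].startswith(b)
                   for a, b in BIGRAM_PATTERNS)
    return False  # trigram patterns omitted in this sketch

print(keep_candidate("being ignored"))  # True: VBG + VBN matches VB+VB
print(keep_candidate("when"))           # False: WRB is not a verb
```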
Learning Positive Verb Phrases
Learning positive sentiment phrases is similar in some respects. First, we collect phrases that can convey a positive sentiment by harvesting n-grams that precede a negative situation phrase in a sarcastic tweet. To learn positive sentiment verb phrases, we extract every 1-gram and 2-gram that occurs immediately to the left of a negative situation phrase, as in the sketch below.
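A minimal sketch of this mirror-image harvesting step, assuming whitespace tokenization:

```python
def preceding_ngrams(tweet, situation):
    """1- and 2-grams immediately to the left of each occurrence
    of a known negative situation phrase."""
    tokens = tweet.lower().split()
    sit = situation.lower().split()
    out = set()
    for i in range(len(tokens) - len(sit) + 1):
        if tokens[i:i + len(sit)] == sit:      # situation starts at i
            for n in (1, 2):
                if i - n >= 0:
                    out.add(" ".join(tokens[i - n:i]))
    return out

print(preceding_ngrams("oh how i love being ignored", "being ignored"))
# -> {'love', 'i love'}
```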
Learning Positive Predicative Phrases
The negative situation phrases are also used to harvest predicative expressions that occur in close proximity. Based on the same assumption that sarcasm often arises from the contrast between a positive sentiment and a negative situation, we target tweets that contain a negative situation phrase with a predicative expression nearby, and we assume that the predicative expression conveys a positive sentiment. We pick positive sentiment candidates by extracting 1-grams, 2-grams, and 3-grams that appear immediately after a copular verb (e.g., is, was) and occur within five words of the negative situation phrase, on either side. This restriction enforces only proximity, not adjacency, because predicative expressions often appear in a separate clause or sentence. For example: "It is just great that my data was stolen" or "My data was stolen. This is great."
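A minimal sketch of this proximity-based harvesting, assuming a small hand-written copula list and whitespace tokenization (note that the crude copula test also fires on passives such as "was stolen"):

```python
COPULAS = {"is", "was", "are", "were", "am", "be", "being", "been"}

def predicative_candidates(tokens, sit_start, sit_len, window=5):
    """1- to 3-grams directly after a copular verb, within `window`
    tokens of the negative situation phrase on either side."""
    out = set()
    for i, tok in enumerate(tokens):
        if tok in COPULAS:
            near = sit_start - window <= i + 1 <= sit_start + sit_len + window
            if near:
                for n in (1, 2, 3):
                    if i + 1 + n <= len(tokens):
                        out.add(" ".join(tokens[i + 1:i + 1 + n]))
    return out

tokens = "it is just great that my data was stolen".split()
print(predicative_candidates(tokens, sit_start=6, sit_len=3))
# -> {'just', 'just great', 'just great that', 'stolen'}
```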