Wednesday, December 21, 2016

Facebook Data Analysis Using R Programming in RStudio


Hi,

Welcome, everyone. In this post I will show you how to access Facebook data and analyze the comments posted on a post, using R programming and the Facebook Graph API Explorer. You will need to download the latest version of RStudio from the following link (RStudio).
 
Now let's begin.
A Facebook developer account is required to get started with the Facebook Graph API.
If you don’t have a Facebook developer account, you can upgrade your personal Facebook account to a developer account from this link.
After registering as a Facebook developer, go to “Tools & Support” -> “Graph API Explorer”.
To explore the Graph API, a token and permissions are required, so click on “Get Token”;
you will then see the screen in the image below.


As public profile access is included in the permissions by default, just click on “Get Access Token” (see the red box in the image above).
A new window like the one below will open; select all the options and click on the “Get Access Token” button.




After that, the access token will be generated (see the image below).



Now that we have the token, let's start exploring.

Extracting Comments from a Public Facebook Post

The first thing you need is the ID of the post. Follow the steps below to get the post ID.

Suppose the post below is the one we want to analyze. Click on the post's date/time (see the highlighted box below).



Copy the ID shown below; this is the post ID.

 


Go to the Graph API Explorer.
Type “Post_id/comments” in the query box and click Submit; you will then see all the comments on your post in JSON form. Next, click the “Get Code” option at the bottom of that page.



Click on “Get Code” to get the cURL code and copy the URL from it; we will use this URL in R.
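The copied URL has the general form shown below (the placeholders in angle brackets stand for your own post ID and access token):

https://graph.facebook.com/v2.8/<POST_ID>/comments?access_token=<ACCESS_TOKEN>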
Next, install the following R packages:
  • install.packages("RCurl"): lets us compose general HTTP requests and provides convenient functions to fetch data.
  • install.packages("rjson"): converts JSON objects into R objects and vice versa.
  • install.packages("tm"): a text-mining package for R; it offers a number of transformations that ease the tedium of cleaning data.
  • NLP
  • slam
  • bitops
  • RJSONIO
  • wordcloud
  • RColorBrewer
  • tmap
  • LearnBayes
  • RXKCD
The remaining packages are installed the same way with install.packages(); a one-line sketch is shown just below.
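If you prefer, all of the packages above can be installed in a single call (skip any that are already installed on your machine):

install.packages(c("RCurl", "rjson", "tm", "NLP", "slam", "bitops",
                   "RJSONIO", "wordcloud", "RColorBrewer", "tmap",
                   "LearnBayes", "RXKCD"))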
 R Commands:


Type the following commands in a new script.


library("RCurl")
library("NLP")
library("slam")
library("tm")
library("bitops")
library("RJSONIO")
library("wordcloud")
library("RColorBrewer")
library("tmap")
library("LearnBayes")
library("RXKCD")
url<- "https://graph.facebook.com/v2.8/378738755798904/comments?access_token=EAACEdEose0cBAMDNBeBPnHayQJYoHwajCqzX8G20jqxGUZ

Cq085T6yqDZAekeHZB2FlL4qBBAKKn5dENi98Iz4a1uOy3RL72TMIVYb6nisc7mmntvWh
9FfBOHo86IkTqWUocBByEiGfarmS1CexnfFgcAZArCvYskpVoWuTsRwZDZD"
d <- getURL(url)                                   # fetch the raw JSON response
j <- fromJSON(d)                                   # parse the JSON into an R list
comments <- sapply(j$data, function(x) {list(comment = x$message)})   # extract the comment text
comments


The URL used in the code above is the one copied from the cURL code in the Graph Explorer.
Run the script and you will get all the comments on your post.
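Note that the Graph API returns comments in pages, so the first response only contains the first page. If your post has many comments, you can follow the paging links; the loop below is a minimal sketch (it assumes the parsed response exposes a paging$next field, which the Graph API normally provides):

all_comments <- list()
page <- fromJSON(getURL(url))
repeat {
  # collect the comment text from the current page
  all_comments <- c(all_comments, lapply(page$data, function(x) x$message))
  # follow the "next" link if there is one, otherwise stop
  next_url <- if (!is.null(page$paging)) page$paging[["next"]] else NULL
  if (is.null(next_url)) break
  page <- fromJSON(getURL(next_url))
}
length(all_comments)   # total number of comments collected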




Cleaning & Analyzing Data:

Create a corpus and remove extra spaces, special characters and other unwanted things. Now let's type the commands below in the script.


Cleanedcomments <- sapply(comments, function(x) iconv(enc2utf8(x), sub = "byte"))
my_corpus <- Corpus(VectorSource(Cleanedcomments))
my_function <- content_transformer(function(x, pattern) gsub(pattern, "", x))
my_cleaned_corpus <- tm_map(my_corpus, my_function, "/")
my_cleaned_corpus <- tm_map(my_cleaned_corpus, my_function, "@")
my_cleaned_corpus <- tm_map(my_cleaned_corpus, my_function, "\\|")
my_cleaned_corpus <- tm_map(my_cleaned_corpus, content_transformer(tolower))
my_cleaned_corpus <- tm_map(my_cleaned_corpus, removeWords, c(stopwords("english"), "wwwmkgdroidblogspot", "wwwmkgdroidblogsp", "httpmkgdroidblogspotin201611viper4androidletvle2max2htmlm1"))
my_cleaned_corpus <- tm_map(my_cleaned_corpus, removePunctuation)
my_cleaned_corpus <- tm_map(my_cleaned_corpus, stripWhitespace)


The green-highlighted words above (the extra strings passed to removeWords) are the words removed from all the comments of the post.
You can specify whichever words you want to eliminate from your comments.
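If you want to verify the cleaning, you can optionally inspect a few entries of the corpus. This is just a quick sanity check, assuming the corpus was built as above and has at least three comments:

inspect(my_cleaned_corpus[1:3])                 # print the first three cleaned comments
lapply(my_cleaned_corpus[1:3], as.character)    # the same entries as plain character strings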

Creating Term Document Matrix:

my_tdm <- TermDocumentMatrix(my_cleaned_corpus)
m<- as.matrix(my_tdm)
View(m)
words <- sort(rowSums(m), decreasing = TRUE)
my_data <- data.frame(word = names(words), freq=words)
View(my_data)


Here are all the extracted words with their frequencies, as shown below.
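Before building the word cloud, a quick bar plot of the most frequent terms can also help. This is an optional extra, assuming my_data was built as above and contains at least ten distinct words:

# bar plot of the ten most frequent words
barplot(my_data$freq[1:10], names.arg = as.character(my_data$word[1:10]),
        las = 2, col = "steelblue", main = "Top 10 words by frequency")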

  

Creating a Word Cloud:
Type this command in the script:

wordcloud(words = my_data$word, freq = my_data$freq, min.freq = 2,
          max.words = 100, random.order = FALSE, rot.per = 0.5,
          colors = brewer.pal(8, "Dark2"))

In this word cloud we take at most 100 words, each with a minimum frequency of 2.
You should see a graph like the one below.
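If you also want to keep the word cloud as an image file rather than only viewing it in RStudio, you can wrap the call in a graphics device. This is a small optional sketch; the output file name is chosen arbitrarily:

png("facebook_wordcloud.png", width = 800, height = 600)   # hypothetical output file name
wordcloud(words = my_data$word, freq = my_data$freq, min.freq = 2,
          max.words = 100, random.order = FALSE, rot.per = 0.5,
          colors = brewer.pal(8, "Dark2"))
dev.off()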





Thank you all for reading. If you have any doubts, please post them in the comment section and I will reply as early as I can.
Hope you enjoy this.
 
