Compare commits

...

193 Commits

Author SHA1 Message Date
ac5061e40d Improving Nitter instances view. 2024-02-10 13:05:47 -05:00
abc2752292 Starting NitterInstance modelling. 2024-02-10 12:34:50 -05:00
27a8a66dd3 Exporting history of instances report. 2024-02-04 19:25:18 -05:00
8843a31d5d Default view of instances for the Nitter class. 2024-02-04 18:54:11 -05:00
d6824d0c97 Querying Nitter instances. 2024-02-04 18:15:29 -05:00
ae92892504 Storing url in DB. 2023-07-27 17:57:54 -05:00
fd1de5971b Storing dates in DB. 2023-07-27 18:00:25 -05:00
03355de4f4 Improving tweets view and user profile image methods. 2023-07-27 16:13:05 -05:00
6e69fa5b3a Fixing methods for peding information. 2023-07-27 12:41:04 -05:00
635dc71ab5 Merge 6d5caf6d6d 2023-07-27 11:21:36 -05:00
57186f444f Refactoring messages. 2023-07-27 11:20:19 -05:00
6d5caf6d6d Fixing address in install instructions. 2023-07-27 09:55:42 -05:00
43662d3fce Refactoring for improving display times. 2023-07-26 20:52:32 -05:00
7d77d26691 Editing tweets preview with limit of tweets and html report file metod. 2023-07-26 11:46:40 -05:00
5336e8d224 Fixing wordcloud data language for report exporter. 2023-07-26 10:04:08 -05:00
69375d2683 Fixing quotes report label for empty quotes. 2023-07-25 21:36:11 -05:00
3148f61eb9 Fixing messages processing for asDictionary and web report exporter. 2023-07-25 21:30:24 -05:00
625e66f006 Creating pdf report button. 2023-07-25 20:24:52 -05:00
f82728d159 Fixing dependencies declaration. 2023-07-25 17:22:55 -05:00
5c82dd485a Fixing dependencies declaration. 2023-07-25 17:16:18 -05:00
9348faaa91 Adding minidocs to baseline. 2023-07-25 17:12:43 -05:00
1c551148fa Redifining dependencies. 2023-07-25 17:07:09 -05:00
4c8d269712 Improving external word cloud for language selection. 2023-05-18 12:27:02 -05:00
7b7fbceedc Modifying nitter provider and improving created format parse for user. 2023-05-18 11:29:19 -05:00
6b70551210 Creating disposable replies and tweets parser from loaded image for nitter user. 2022-09-21 16:48:44 -05:00
ff457ddbe7 Redefining dependencies. 2022-07-24 15:49:50 -05:00
ff72e94bb5 Extra initializer to support new data narratives. 2022-07-15 11:13:03 -05:00
7bb4d66d57 New instance creation. 2022-07-04 18:17:13 -05:00
482834e205 Merge fa694f99a7 2022-06-22 17:04:05 -05:00
7c546c32b8 Fixing private tiddler 20220622164935351. 2022-06-22 17:01:03 -05:00
fa694f99a7 Refining dependencies. 2022-06-22 16:47:44 -05:00
485572a81e Refactoring quotes and retweets histograms. 2022-06-21 13:20:46 -05:00
ff38d34dc4 Refactoring histograms data calculation and exporters. 2022-06-21 11:59:13 -05:00
4d439283b1 Creating tweets collectioin splits by number and Refactoring Timespan extensions. 2022-06-21 08:40:44 -05:00
7da6f87daf Refactoring tweets collections time spans. 2022-06-20 20:33:53 -05:00
2cec4816a3 Creating tweets collection split by weeks. 2022-06-20 20:27:47 -05:00
c91c793683 Merge 29b758c58a 2022-06-17 18:26:20 -05:00
0af6e835b2 Modifying tweets collection spliting. 2022-06-17 18:11:07 -05:00
29b758c58a Modurizing file exporters. 2022-06-17 18:10:48 -05:00
fdeecc5fde Creating extension for roassal for exporting histograms. 2022-06-17 17:27:00 -05:00
215e83d66e Creating tweets collections split by days. 2022-06-16 20:44:15 -05:00
6548c70291 Merge 6f821dfe53 2022-06-15 18:46:53 -05:00
61eb15f8e3 Creating author ids to get all the ids from database and refactoring tweets filtering. 2022-06-15 18:43:59 -05:00
6f821dfe53 Merge 0b172e6f29 2022-06-15 18:29:26 -05:00
5333b1c8d8 In reality a single user identity can change of IDs and or @names. This change prepares the changes to model that. 2022-06-15 18:18:59 -05:00
0b172e6f29 Modyfing tweets for filtering user names with id changed. 2022-06-15 16:30:09 -05:00
d4432bae7a Improving required folders detection for exporting visualizations. 2022-06-09 20:05:03 -05:00
fb00701e9e Fixing messages and Improving remote messages from html. 2022-06-09 18:07:22 -05:00
d2025d8aa9 Fixing quotes report data and Modifying retweets report data. 2022-06-09 16:05:38 -05:00
1dec7081e4 Modifying quotes report data. 2022-06-09 16:02:35 -05:00
33dccbc851 Creating replies by time span. 2022-06-09 15:53:58 -05:00
c8e7509e6a Creating nitter user as dictionary for web report and twwets by time span. 2022-06-09 15:50:26 -05:00
40d47caf1c Modifying get messages method. 2022-06-09 15:20:02 -05:00
7ddbed96bc Modifying templates test and installation 2022-06-09 15:18:07 -05:00
3d7cc8110c Modifying messages decalration for twitter and nitter user. 2022-06-09 15:12:36 -05:00
68333ff0c0 Creating templates testing and Modifying exporters 2022-06-09 14:14:37 -05:00
dcb098e0c7 Fixing bug at install commons. 2022-06-09 13:51:20 -05:00
722b2d2066 Creating word cloud data by language and a BUG at install commons. 2022-06-09 13:43:48 -05:00
07516e4dc6 Creating get remote messages from html and Modifying nitter provider. 2022-06-09 13:10:25 -05:00
773d94383c Modifyng wordcloud data filtering. 2022-06-09 12:29:38 -05:00
c535993e47 Modifyng nitter user as dictionary for worcloud data exporter. 2022-06-09 12:15:48 -05:00
0577beb724 Creating a word cloud data for web static exporter. 2022-06-09 11:53:27 -05:00
dc446f2026 Modifying socialmetrica baseline. 2022-06-09 10:18:49 -05:00
8d9a66e012 Creating week report contract and Modifying tweets histogram for empty tweets. 2022-06-08 17:26:29 -05:00
8f3c63dc71 Modifying tweets parser due to a change in twitter ids. 2022-06-08 16:40:43 -05:00
94b104e0e8 Creating retweets static web report histogram. 2022-06-04 17:49:18 -05:00
91744733bd Creating replies and quotes static web report histogram. 2022-06-03 19:43:01 -05:00
c713ad9d28 Modifying asDictionary for exporting as static web report. 2022-06-01 23:32:31 -05:00
3ad7148e62 Creating export as static web report. 2022-06-01 15:52:32 -05:00
5fdad7fa69 Modifying nitter user parse as dictionary and as tiddler. 2022-05-31 13:08:05 -05:00
cb8bdbbaee Creating nitter user parsing as tiddler. 2022-05-25 16:23:15 -05:00
e9d8745818 Modifying baseline of socialmetrica for TiddlyWikiPharo dependency. 2022-05-25 12:45:25 -05:00
33805fd903 Modifying export overview LaTex with automated weeks. 2022-05-25 12:31:53 -05:00
9544a92392 Modifying export overview LaTex and nitter user id for deleted users. 2022-05-25 11:50:30 -05:00
7a7058f30c Merge 71484b16b0 2022-05-21 22:17:47 -05:00
cee2682e7a Modifying tweets filtering for id change. 2022-05-21 22:15:14 -05:00
71484b16b0 Not found user: starting implementation. 2022-05-21 21:45:14 -05:00
f9b7fdf6bb Modifying newest and oldest tweet selection. 2022-05-15 17:15:12 -05:00
75290c2538 Merge 7fa248e4fa 2022-05-15 16:32:33 -05:00
bc6a4fb9eb Tweets scraping: updating from web until the most recently stored. 2022-05-15 16:32:06 -05:00
7fa248e4fa Merge 7717f00185 2022-05-15 13:58:56 -05:00
f467129c42 Fixing tweets by date filtering and replies. 2022-05-15 13:56:39 -05:00
7717f00185 Tweets scraping: updating from web until the most recently stored. 2022-05-15 13:51:33 -05:00
9ae819722f Merge 463781085b 2022-05-15 12:40:08 -05:00
7c6d7d9d80 Improving tweets scraping. 2022-05-15 12:39:10 -05:00
463781085b Fixing tweets filtering and write words for external cloud. 2022-05-15 12:31:23 -05:00
a515a24a81 Refactoring tweets scrapping: getting oldest ones. 2022-05-15 08:50:22 -05:00
a61de2ecb7 Refactoring tweets scrapping. 2022-05-15 08:23:21 -05:00
742bb97446 New dependencies: Mustache. 2022-05-14 20:31:16 -05:00
bba0f9088d Debugging dependencies. 2022-05-14 20:09:38 -05:00
ac5d935ce1 Fixing messages for nitter user and tweets filtering. 2022-05-14 19:39:21 -05:00
0f62345220 Merge 33013dc52f 2022-05-14 18:25:05 -05:00
624198091e Modifying baseline of socialmetrica. 2022-05-14 18:24:56 -05:00
33013dc52f Merge ac627f756c 2022-05-14 18:18:02 -05:00
390a6c3ef1 Bugfix: Avatar picture. 2022-05-14 18:17:34 -05:00
ac627f756c Merge fa1cf4a6d8 2022-05-14 18:07:28 -05:00
f898af0c05 Modifying baseline exception on socialmetrica. 2022-05-14 18:07:13 -05:00
fa1cf4a6d8 Merge 7d6c8259f6 2022-05-14 17:46:54 -05:00
c16f109067 Bugfix: Tweets and avatar 2022-05-14 17:45:58 -05:00
7d6c8259f6 Merge 49f44385d7 2022-05-14 16:56:20 -05:00
defb1a8819 Modifyng installer for commons. 2022-05-14 16:56:07 -05:00
49f44385d7 Fixing installation requirements and adding new ones. 2022-05-14 16:39:23 -05:00
c32a351932 Dealing with changes in user Id. 2022-05-04 22:43:23 -05:00
459502cd5f Implementing reporting periods to limit messages processing inside a predefined timespan. 2022-05-04 07:27:25 -05:00
4abb9fe987 First draft to limit the reporting period. Future ones should have a TweetsCollection property to define such limits. 2022-05-03 23:08:54 -05:00
a2d5d80e57 Modifying newest tweet selection. 2022-05-03 21:25:22 -05:00
a08073232d Modifying newest tweet selection. 2022-05-03 20:50:20 -05:00
35d9d037d3 Modifying oldest tweet selection and oldest tweet for nitter user. 2022-05-03 12:32:05 -05:00
ae652fdf3a Modifying bars, quotes and replies histograms. 2022-05-03 11:05:56 -05:00
21b4f432c4 Temporal bugfix to deal with Twitter changes in user id. 2022-05-03 08:32:10 -05:00
419aa176ab Modifying date displays in histograms and fixing empty replies and quotes exporter. 2022-05-02 22:31:41 -05:00
6a628fddd3 Modyfing nitter user created at parsing. 2022-05-02 20:15:19 -05:00
aa5cf9cbba Modyfing nitter user sanitization for tex exporter. 2022-04-29 17:58:03 -05:00
104fe9e348 Fixing method for geting oldest tweets, modifying store avoiding duplication. 2022-04-28 16:17:56 -05:00
06b0d5748d Creating download tweets from oldest up to a page, modifying histograms exporters and external word cloud without mask. 2022-04-27 23:51:53 -05:00
ac963c753f Minor change in oldest tweet selection. 2022-04-27 15:51:40 -05:00
d2d0f739a8 Modifying nitter user for mustache template. 2022-04-27 15:38:10 -05:00
ad26072420 Fixing replies filtering and modifying tweets collection messages. 2022-04-26 14:05:58 -05:00
bd69b73529 Fixing raw tweets collection and tweets filtering from html. 2022-04-25 18:59:10 -05:00
b964460fa3 Fixing page contents downloader and modifying tweets collection. 2022-04-25 18:21:13 -05:00
a38124638d Creating tweet testing, and modifyng nitter user asdictonary. 2022-04-25 17:12:04 -05:00
3ff8dfc86a Code review: improved modularity. A second pass on #downloadWithRenaming: still pending. 2022-04-25 09:13:04 -05:00
bd279b1fd3 Code Review: Formatting and ward clause. Method should be made more modular. 2022-04-25 06:26:54 -05:00
bbfae14949 Creating exporter for report in LaTeX format. 2022-04-21 12:11:53 -05:00
f66d93e2ca Refactoring: minor changes in histogram exporters. 2022-04-20 16:59:57 -05:00
57adfca53a Creating quotes histogram exporter. 2022-04-20 16:44:34 -05:00
66298fc248 Fixing and creating retweets histogram exporters. 2022-04-20 16:27:54 -05:00
cde55509bf Creating retweets and quotes sorted by occurrences. 2022-04-20 15:49:50 -05:00
e840f733fe Tweet user bug fix. 2022-04-20 14:56:08 -05:00
1e06d16801 Removing unnecessary methods. 2022-04-20 14:13:02 -05:00
e43b375a6b Creating exporter for retweets. 2022-04-19 13:55:39 -05:00
e8373fab2c Merge pull request 'detached/2' (#2) from detached/2 into master
Reviewed-on: https://code.tupale.co/Offray/Socialmetrica/pulls/2
2022-04-19 00:23:25 -05:00
4e6b134848 Merge 4ab8c6f0dd 2022-04-19 00:20:42 -05:00
99a03c0ad1 Tweets collection now inherits from OrderedCollection to improve reusability. 2022-04-19 00:19:23 -05:00
4ab8c6f0dd Recommiting changes in ee905 and two latter (why they were reverted?) 2022-04-18 22:36:10 -05:00
b92102a9d1 Merge dc945dd8f9 2022-04-18 22:16:00 -05:00
c2078d032f Minor fix in quotes scraping. 2022-04-18 22:14:17 -05:00
dc945dd8f9 Modifying histograms exporters. 2022-04-18 19:19:49 -05:00
2383ea4459 Refactoring collect tweets. 2022-04-18 19:00:58 -05:00
29c6deb617 Creating quotes filtering for nitter user messages. 2022-04-18 18:52:56 -05:00
3a58d281e2 Merge e98fd45083 2022-04-18 18:47:07 -05:00
a687e1b688 Creating metadata parsing from nitter html. 2022-04-18 18:44:52 -05:00
e98fd45083 Improving creation date processing/transfomation from different formats. #retrieveContents should also load local messages. 2022-04-18 18:36:52 -05:00
906b29a7ef Creating quotes metadata parsing. 2022-04-18 18:36:07 -05:00
8d87ee3ed7 Creating replies and retweets filtering and modifying histograms. 2022-04-18 17:29:09 -05:00
bfa2a2ab99 Modifying documentTree for downloading replies. 2022-04-18 16:14:01 -05:00
ad592beaf8 Creating replies download and recognition from nitter html. 2022-04-18 15:53:57 -05:00
49d5529934 Fixing creation date return depending on type of scrapping. 2022-04-17 12:30:28 -05:00
7de47a4959 Fixing tweets rendeting: loading users from local DB or remote scrapping. 2022-04-16 22:36:46 -05:00
ee9050a9b9 Fixing tweets rendeting by loading local users from DB. 2022-04-16 19:53:33 -05:00
50a3ef901b Code review: More modular code. 2022-04-16 12:08:30 -05:00
5d457e824a Creating histogram exporter for tweets and modifyng install external word cloud commons. 2022-04-16 01:59:52 -05:00
f28e884a79 Creating default exporter for tex template. 2022-04-16 01:37:11 -05:00
c9d9ec9620 Creating installation for template. 2022-04-16 01:32:02 -05:00
1f081f58cf Creating time spans division for tweets collection. 2022-04-15 18:53:06 -05:00
2de7cf1d11 Now external word cloud installs its common files and loads required data. 2022-04-15 14:35:11 -05:00
986f4a83bc Better variable namings 2022-04-15 08:49:20 -05:00
382dcd9954 Calculating time subperiods for tweets histogram. 2022-04-15 07:36:53 -05:00
0da5c06527 Initial methods for tweets histogram. 2022-04-15 00:05:17 -05:00
d61490e96c Making word cloud reproducible in local copy. 2022-04-14 20:27:26 -05:00
216832c3e2 Fixing external word cloud commons installation. 2022-04-14 20:00:06 -05:00
a50e37dce1 Modifying external cloud commons installation. 2022-04-14 19:39:46 -05:00
4341383a8c Modifying external word cloud parameters. 2022-04-14 19:20:45 -05:00
f97b193c5e Renaming methods for clarity. 2022-04-14 17:40:41 -05:00
b9eedc6ec3 Fixing connection to ReStore DB in tweets collection. 2022-04-14 16:43:31 -05:00
02ab2bd74a Local storage: getting messages from local DB. 2022-04-12 21:58:51 -05:00
f563f964b2 Creating testing for twitter user in ReStore. 2022-04-12 18:59:12 -05:00
8f2ed277fb Modifying tweet created for ReStore definition. 2022-04-12 18:23:08 -05:00
a445100e58 Fixing tweet form nitter html and collecting tweets for nitteruser. 2022-04-12 18:13:31 -05:00
bcf67809a8 Improving nitter user getting pages. 2022-04-12 16:05:59 -05:00
2c61055b38 Improved naming to better convey intention and more TO DO's. 2022-04-12 12:03:48 -05:00
52b90dd8dd New TO DO annotations and improving formating. 2022-04-12 10:15:24 -05:00
9b99ed0e33 Now each tweet stores metadata about the Nitter timeline where it appears. 2022-04-11 20:46:22 -05:00
db1a8e0502 Improving nitter user getting and parsing tweets. 2022-04-11 14:18:52 -05:00
bc168c4f0a Code review: Modularization. 2022-04-11 12:11:10 -05:00
65c36e924c Code review. Improving modularization and naming. 2022-04-11 11:42:54 -05:00
89a73603a3 Spliting between local and remote messages. Minor UI improvements. 2022-04-11 10:05:21 -05:00
2c2ee3b132 08.1376 Migration: Tweets previews problems deprecating BlLazyElement by BrAsyncWidget. 2022-04-10 12:37:25 -05:00
94ce588376 08.1376 Migration: Fixing profile previews problems caused by BlLazyElement deprecation. 2022-04-10 11:22:59 -05:00
01a62b2b2a 08.1376 Migration: Fixing profile previews problems caused by BlLazyElement deprecation. 2022-04-09 23:09:23 -05:00
36c261d534 Migrating to 08.1376, fixing code smells and refining UI elements because of external deprecation (but old problems arise). 2022-04-09 19:31:10 -05:00
a84e9504f9 Improving tweets parsing from nitter timeline item. 2022-04-08 23:50:31 -05:00
d9af46ec37 Improving tweet parsing from nitter timeline item. 2022-04-08 22:02:27 -05:00
e8f9e1ddd9 Merge pull request 'detach-recovery' (#1) from detach-recovery into master
Reviewed-on: https://code.tupale.co/Offray/Socialmetrica/pulls/1
2022-04-08 11:24:42 -05:00
907fffe4be Merge 2e6e480668 2022-04-08 11:17:21 -05:00
8c9e302fdd Fixing conflict. 2022-04-08 11:15:04 -05:00
ba05b8d0bb Changing profileImageUrl definition to ease ReStore serialization. Tip: Maybe derived keys can be used to serialize url as Strings in SQLite. 2022-04-08 11:10:24 -05:00
1e4f3d1190 Merge 2e6e480668 2022-04-07 14:20:57 -05:00
b198bde247 Improving urls for load more tweets and removing unnecessary method. ref 5b169c4ebf 2022-04-07 14:17:17 -05:00
2e6e480668 Merge 610c8e4ca1 2022-04-07 13:35:11 -05:00
17b39ec4ad Fixing TwitterUser id detection. 2022-04-07 13:33:46 -05:00
610c8e4ca1 Improving nitter user profile image exporter. 2022-04-07 11:28:18 -05:00
6a60f4f13f Code smell: redundant information of folder name and file name. 2022-04-06 13:24:51 -05:00
179 changed files with 1446 additions and 163 deletions

View File

@ -5,10 +5,13 @@ baseline: spec
for: #common
do: [
"Dependencies"
- self xmlParserHTML: spec.
- "self rssTools: spec."
+ "self xmlParserHTML: spec."
+ self reStore: spec.
+ self miniDocs: spec.
+ "self roassal3Exporters: spec."
+ "self tiddlyWikiPharo: spec."
"Packages"
spec
package: 'Socialmetrica'
- with: [ spec requires: #('XMLParserHTML' "'RSSTools'") ]
+ with: [ spec requires: #('ReStore' 'MiniDocs' "'XMLParserHTML' 'Roassal3Exporters' 'TiddlyWikiPharo'") ]
]

View File

@ -0,0 +1,8 @@
accessing
miniDocs: spec
| repo |
repo := ExoRepo new
repository: 'https://code.sustrato.red/Offray/MiniDocs'.
repo load.
spec baseline: 'MiniDocs' with: [ spec repository: 'gitlocal://', repo local fullName ]

View File

@ -0,0 +1,10 @@
baselines
reStore: spec
Metacello new
repository: 'github://rko281/ReStoreForPharo';
baseline: 'ReStore';
onConflict: [ :ex | ex useLoaded ];
onWarningLog;
load: 'all'.
spec baseline: 'ReStore' with: [ spec repository: 'github://rko281/ReStoreForPharo']

View File

@ -0,0 +1,7 @@
baselines
roassal3Exporters: spec
Metacello new
baseline: 'Roassal3Exporters';
repository: 'github://ObjectProfile/Roassal3Exporters';
load.
spec baseline: 'Roassal3Exporters' with: [ spec repository: 'github://ObjectProfile/Roassal3Exporters']

View File

@ -1,10 +0,0 @@
baselines
rssTools: spec
Metacello new
repository: 'github://brackendev/RSSTools-Pharo:v1.0.1/src';
baseline: 'RSSTools';
onConflict: [ :ex | ex useLoaded ];
onUpgrade: [ :ex | ex useLoaded ];
onDowngrade: [ :ex | ex useLoaded ];
load.
spec baseline: 'RSSTools' with: [ spec repository: 'github://brackendev/RSSTools-Pharo:v1.0.1/src']

View File

@ -0,0 +1,8 @@
baselines
tiddlyWikiPharo: spec
| repo |
repo := ExoRepo new
repository: 'https://code.sustrato.red/Offray/TiddlyWikiPharo'.
repo load.
spec baseline: 'TiddlyWikiPharo' with: [ spec repository: 'gitlocal://', repo local fullName ]

View File

@ -4,8 +4,6 @@ xmlParserHTML: spec
baseline: 'XMLParserHTML';
repository: 'github://pharo-contributions/XML-XMLParserHTML/src';
onConflict: [ :ex | ex useLoaded ];
- onUpgrade: [ :ex | ex useLoaded ];
- onDowngrade: [ :ex | ex useLoaded ];
onWarningLog;
load.
spec baseline: 'XMLParserHTML' with: [spec repository: 'github://pharo-contributions/XML-XMLParserHTML/src']

View File

@ -6,7 +6,7 @@ To install, first install [ExoRepo](https://code.tupale.co/Offray/ExoRepo) and t
```smalltalk
ExoRepo new
- repository: 'https://code.tupale.co/Offray/Socialmetrica';
+ repository: 'https://code.sustrato.red/Offray/Socialmetrica';
load.
```

View File

@ -0,0 +1,11 @@
*Socialmetrica
periodsSince: startingDate until: endingDate
| borders subperiodDuration |
subperiodDuration := (endingDate - startingDate) / self.
borders := OrderedCollection new.
borders add: startingDate.
1 to: self do: [ :i | | ending |
ending := startingDate + (subperiodDuration * i).
borders add: ending.
].
^ borders
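For orientation: sent to an Integer, the extension above splits the span between startingDate and endingDate into that many equal subperiods and answers the subperiod borders (receiver + 1 dates). A hypothetical check, not part of this commit:

```smalltalk
"Hypothetical example: 4 subperiods between Jan 1 and Jan 29 yield 5 borders, one per week."
| borders |
borders := 4
	periodsSince: '2022-01-01' asDateAndTime
	until: '2022-01-29' asDateAndTime.
borders size. "5"
(borders second - borders first) days. "7"
```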

View File

@ -0,0 +1,3 @@
{
"name" : "Integer"
}

View File

@ -0,0 +1,8 @@
accessing
columnsDictionary
^ {
'url' -> 'url' .
'healthy' -> 'healthy'.
'healthy_percentage_overall' -> 'uptime' .
'rss' -> 'rss' .
'version' -> 'version'} asDictionary

View File

@ -0,0 +1,4 @@
accessing
exportInstanceReport
MiniDocs exportAsSton: self instances on: FileLocator temp / 'instances.ston'.
^ FileLocator temp / 'instances.ston'

View File

@ -0,0 +1,8 @@
accessing
instanceRows
^ (self instances at: 'hosts') collect: [:rawRow | | newRow |
newRow := NitterInstance new.
self columnsDictionary keysAndValuesDo: [:key :value |
newRow writeSlotNamed: value value: (rawRow at: key) ].
newRow
].

View File

@ -0,0 +1,3 @@
accessing
instances
^ self instanceRows

View File

@ -0,0 +1,9 @@
accessing
instancesCache
| cache cacheFile |
cacheFile := FileLocator temp / 'nitter-instances.ston'.
cacheFile exists
ifFalse: [
cache := STON fromString: 'https://status.d420.de/api/v1/instances' asUrl retrieveContents.
MarkupFile exportAsFileOn: cacheFile containing: cache ].
^ STON fromString: cacheFile contents

View File

@ -0,0 +1,2 @@
accessing
instancesTable

View File

@ -0,0 +1,21 @@
accessing
viewInstancesFor: aView
<gtView>
| columnedList columnNamesMap |
self instances isEmptyOrNil ifTrue: [ ^ aView empty].
columnedList := aView columnedList
title: 'Instances';
items: [ self instanceRows ];
priority: 1.
columnNamesMap := {
'Instance' -> 'url'.
'Healthy' -> 'healthy'.
'Uptime %' -> 'uptime'.
'RSS' -> 'rss'.
'Nitter Version' -> 'version'} asOrderedDictionary.
columnNamesMap keysAndValuesDo: [:aName :value |
columnedList
column: aName
text: [:instanceRow | (instanceRow readSlotNamed: value) ifNil: [''] ]
].
^ columnedList

View File

@ -0,0 +1,11 @@
{
"commentStamp" : "",
"super" : "Object",
"category" : "Socialmetrica",
"classinstvars" : [ ],
"pools" : [ ],
"classvars" : [ ],
"instvars" : [ ],
"name" : "Nitter",
"type" : "normal"
}

View File

@ -0,0 +1,7 @@
I model a Nitter instance uptime & health tracker as described in https://status.d420.de/about.
Taken from the official documentation in the previous link, the fields we are modelling are:
- healthy: stands for hosts which are reachable and pass a content check.
- rss: whether the host has RSS feeds enabled.
- version: which nitter version the host reports.
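For orientation, a minimal usage sketch (an illustration, not part of this diff), assuming `Nitter new instanceRows` answers a collection of the `NitterInstance` objects whose accessors appear elsewhere in this diff:

```smalltalk
"Hypothetical example: list the healthy Nitter instances with their uptime."
| healthy |
healthy := Nitter new instanceRows select: [ :each | each isHealthy ].
healthy do: [ :each |
	Transcript
		show: each url asString;
		show: ' uptime: ';
		show: each uptime asString;
		cr ].
```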

View File

@ -0,0 +1,3 @@
accessing
hasRSS
^ self rss

View File

@ -0,0 +1,3 @@
accessing
healthy
^ healthy

View File

@ -0,0 +1,3 @@
accessing
isHealthy
^ self healthy

View File

@ -0,0 +1,5 @@
accessing
printOn: aStream
super printOn: aStream.
^ aStream
nextPutAll: '( ', self url ,' | uptime: ', self uptime asString, ' )'

View File

@ -0,0 +1,3 @@
accessing
rss
^ rss

View File

@ -0,0 +1,3 @@
accessing
uptime
^ uptime

View File

@ -0,0 +1,3 @@
accessing
url
^ url

View File

@ -0,0 +1,17 @@
{
"commentStamp" : "<historical>",
"super" : "Object",
"category" : "Socialmetrica",
"classinstvars" : [ ],
"pools" : [ ],
"classvars" : [ ],
"instvars" : [
"url",
"healthy",
"uptime",
"rss",
"version"
],
"name" : "NitterInstance",
"type" : "normal"
}

View File

@ -3,4 +3,4 @@ nitterProvider
"For a full list of Nitter providers, see:
https://github.com/zedeus/nitter/wiki/Instances"
- ^ 'https://nitter.42l.fr/'
+ ^ 'https://nitter.net/'

View File

@ -0,0 +1,4 @@
accessing
areCommonFilesInstalled
"Returns true if common files are in the proper location, and false otherwise."
^ ((TweetsCollection dataStore / 'commons') children collect: [ :file | file basename ] ) includesAll: self class commonFiles keys

View File

@ -1,7 +1,29 @@
accessing
asDictionary
+ | tweets tweetsHistogramData repliesHistogramData quotesHistogramData retweetsHistogramData |
+ [self config at: 'lang']
+ onErrorDo: [ ^ self inform: 'Please put a lang key with a language (for example: en) in the object config.' ].
+ tweets := self messages.
+ tweetsHistogramData := self tweetsByWeeksTimeSpan.
+ repliesHistogramData := self repliesByWeeksTimeSpan.
+ quotesHistogramData := self quotesReportData.
+ retweetsHistogramData := self retweetsReportData.
^ { 'profile-card-avatar' -> self profileImageFile fullName.
+ 'profile-card-avatar-url' -> self profileImageUrl.
'profile-card-fullname' -> self name .
'profile-card-username' -> self userName .
- 'profile-bio' -> self profileBio } asDictionary
+ 'profile-bio' -> self profileBio.
+ 'messages-size' -> tweets size.
+ 'messages-newest' -> tweets newest created asDate greaseString.
+ 'messages-oldest' -> tweets oldest created asDate greaseString.
+ 'tweets-histogram-labels' -> tweetsHistogramData third.
+ 'tweets-histogram-quantity' -> tweetsHistogramData second.
+ 'replies-histogram-labels' -> repliesHistogramData third.
+ 'replies-histogram-quantity' -> repliesHistogramData second.
+ 'retweets-histogram-labels' -> retweetsHistogramData third.
+ 'retweets-histogram-quantity' -> retweetsHistogramData second.
+ 'quotes-histogram-labels' -> quotesHistogramData third.
+ 'quotes-histogram-quantity' -> quotesHistogramData second.
+ 'wordcloud-data' -> (self wordcloudDataLanguage: (self config at: 'lang')) first.
+ } asDictionary

View File

@ -0,0 +1,28 @@
accessing
asDictionaryForWeb
| tweets tweetsHistogramData repliesHistogramData quotesHistogramData retweetsHistogramData |
tweets := self messages.
tweetsHistogramData := self tweetsByTimeSpan: 7.
repliesHistogramData := self repliesByTimeSpan: 7.
quotesHistogramData := self quotesReportData.
retweetsHistogramData := self retweetsReportData.
^ { 'profile-card-avatar' -> self profileImageFile fullName.
'profile-card-avatar-url' -> self profileImageUrl.
'profile-card-fullname' -> self name .
'profile-card-username' -> self userName .
'profile-bio' -> self profileBio.
'messages-size' -> tweets size.
'messages-newest' -> tweets newest created asDate greaseString.
'messages-oldest' -> tweets oldest created asDate greaseString.
'tweets-histogram-labels' -> tweetsHistogramData third.
'tweets-histogram-quantity' -> tweetsHistogramData second.
'replies-histogram-labels' -> repliesHistogramData third.
'replies-histogram-quantity' -> repliesHistogramData second.
'retweets-histogram-labels' -> retweetsHistogramData third.
'retweets-histogram-quantity' -> retweetsHistogramData second.
'quotes-histogram-labels' -> quotesHistogramData third.
'quotes-histogram-quantity' -> quotesHistogramData second.
'wordcloud-data' -> self wordcloudData first.
} asDictionary

View File

@ -0,0 +1,16 @@
accessing
asTiddler
| tempDict tiddler |
tiddler := Tiddler new.
tiddler customFields.
tempDict := self asDictionary.
tempDict keysAndValuesDo: [ :key :value |
tiddler customFields at: key put: value
].
^ tiddler
created;
title: (tiddler customFields at: 'profile-card-username');
type: 'text/vnd.tiddlywiki';
text:
'<<image-card "', self profileImageUrl, '" title:', ($' asString), '<$transclude field="profile-card-fullname"/>', ($' asString), 'text:', ($' asString), '<$transclude field="profile-bio"/>', ($' asString), 'footer:', ($' asString), '@<$transclude field="profile-card-username"/>', ($' asString), 'align:"center" pos:"top">>'.

View File

@ -0,0 +1,4 @@
accessing
authorIds
^ TwitterUser storedInstances select: [ :each | each userName = self userName ] thenCollect: [ :each | each id ]

View File

@ -0,0 +1,12 @@
accessing
avatarPicture
| response profileImgFile|
profileImgFile := self profileImageFile.
profileImgFile exists
ifTrue: [ ^ (ImageReadWriter formFromFileNamed: profileImgFile fullName) asElement ].
response := ZnClient new url: (self profileImageUrl); get; response.
response contentType = ZnMimeType imageJpeg
ifTrue: [ ^ (PluginBasedJPEGReadWriter gtFromBuffer: response contents) asElement ].
response contentType = ZnMimeType imagePng
ifTrue: [ ^ (PNGReadWriter gtFromBuffer: response contents) asElement ].
^ GtABContact new avatar

View File

@ -0,0 +1,33 @@
accessing
collectRawTweetsFrom: anUrl upToPage: anInteger
| pagesDict response customQuery |
pagesDict := self getPagesContentsFrom: anUrl upTo: anInteger.
response := TweetsCollection new.
customQuery := Dictionary new
at: 'parameters' put: pagesDict keys;
at: 'date' put: DateAndTime now;
yourself.
response query: customQuery.
pagesDict keysAndValuesDo: [ :key :rawTweets | | temp |
temp := (rawTweets xpath: '//div[@class="timeline-item "]') asOrderedCollection
collect: [ :xmlElement | xmlElement postCopy ].
temp do: [ :tweet | | tempTweet |
tempTweet := Tweet new fromNitterHtmlItem: tweet.
tempTweet metadata
at: DateAndTime now asString put: key;
yourself.
response add: tempTweet.
]
].
response messages: (response messages select: [ :tweet | tweet isNotNil ]).
response messages doWithIndex: [ :tweet :i |
| current previous |
current := response messages at: i.
i < response lastIndex ifTrue: [
previous := response messages at: i + 1.
current timelines
at: self userName put: previous id;
yourself ]].
^ response.

View File

@ -0,0 +1,5 @@
accessing
collectRawTweetsFromOldestUpToPage: anInteger
^ self collectRawTweetsFrom: self oldestTweetPageCursor upToPage: anInteger

View File

@ -0,0 +1,5 @@
accessing
collectRawTweetsUpToPage: anInteger
^ self collectRawTweetsFrom: self userNameLinkWithReplies upToPage: anInteger

View File

@ -0,0 +1,8 @@
accessing
configureDefaultReportingPeriod
[ config at: 'reportingPeriod' ]
onErrorDo: [ self config
at: 'reportingPeriod'
put: (Timespan
starting: messages oldest created asDateAndTime
ending: messages newest created asDateAndTime + 1 minute) ]

View File

@ -2,5 +2,7 @@ accessing
createdAt
^ createdAt ifNil: [| joinDateString |
joinDateString := ((self documentTree xpath: '//div[@class="profile-joindate"]/span/@title') stringValue).
- createdAt := (ZTimestampFormat fromString:'4:05 PM - 03 Feb 2001') parse: joinDateString.
+ createdAt := (ZTimestampFormat fromString:'4:05 PM - 3 Feb 2001') parse: joinDateString.
+ createdAt := createdAt asDateAndTime
]

View File

@ -1,5 +0,0 @@
accessing
defaultConfig
self config: { 'folder' -> (FileLocator userData / 'Socialmetrica' / self userName) } asDictionary.
^ self config

View File

@ -1,3 +1,3 @@
operation
documentTree
- ^ XMLHTMLParser parse: self userNameLink asUrl retrieveContents
+ ^ self documentTreeFor: self userNameLinkWithReplies

View File

@ -0,0 +1,3 @@
accessing
documentTreeFor: anUrl
^ XMLHTMLParser parse:anUrl asUrl retrieveContents

View File

@ -1,4 +1,12 @@
accessing
downloadProfileImage
- ^ self exportProfileImageOn: self folder / self userName, 'jpg'
+ self remoteIsFound
+ ifTrue: [ ^ self exportProfileImageOn: self folder / 'profile-image', 'jpg' ]
+ ifFalse: [ | tempFile |
+ tempFile := (self folder / 'profile-image', 'jpg') asFileReference.
+ tempFile ensureCreateFile.
+ tempFile binaryWriteStreamDo: [ :stream |
+ stream nextPutAll: 'https://mutabit.com/repos.fossil/mutabit/uv/wiki/commons/twitter-user-image-default.jpg' asUrl retrieveContents.
+ super class inform: 'Exported as: ', String cr, tempFile fullName.
+ ^ self folder / 'profile-image', 'jpg' ]]

View File

@ -0,0 +1,40 @@
as yet unclassified
downloadWithRenaming: fileReference
| file tempFile fileHash tempFileHash |
file := fileReference asFileReference .
tempFile := FileLocator temp / fileReference basename.
tempFile ensureCreateFile.
tempFile binaryWriteStreamDo: [ :stream |
stream nextPutAll: profileImageUrl asUrl retrieveContents.
super class inform: 'Exported as: ', String cr, tempFile fullName.
].
OSSUnixSubprocess new
command: 'openssl';
arguments: { 'dgst' . '-sha256' . file fullName};
workingDirectory: (self folder) fullName;
redirectStdout;
redirectStderr;
runAndWaitOnExitDo: [ :process :outString |
fileHash := (outString splitOn: ' ') second trimmed].
OSSUnixSubprocess new
command: 'openssl';
arguments: { 'dgst' . '-sha256' . tempFile fullName};
workingDirectory: (self folder) fullName;
redirectStdout;
redirectStderr;
runAndWaitOnExitDo: [ :process :outString |
tempFileHash := (outString splitOn: ' ' ) second trimmed].
fileHash = tempFileHash
ifFalse: [
file copyTo: self folder /
(file basenameWithoutExtension , '-',
('-' join:
((file creationTime asLocalStringYMDHM) splitOn: ' ')), '.jpg').
file ensureDelete.
^ { 'Profile image changed' ->
(tempFile moveTo: file)} asDictionary ].
^ { 'Same Profile Image' -> file } asDictionary

View File

@ -0,0 +1,6 @@
accessing
exportDefaultReport
(self hasFolder: 'templates')
ifFalse: [ self installTemplate ].
^ self exportWithTemplate: (TweetsCollection dataStore / 'templates' / 'template.mus.tex') into: self folder

View File

@ -0,0 +1,9 @@
accessing
exportEmptyHistogramNamed: aDictionary
| histogram |
histogram := RSChart new.
histogram extent: (aDictionary at: 'extent').
histogram build.
histogram canvas exportAsFileNamed: (aDictionary at: 'messagesType'), '-histogram' into: self folder.
^ self

View File

@ -0,0 +1,49 @@
accessing
exportHistogramFor: aDictionary By: aTypeString
"TODO: quotes and retweets"
| messagesDict histogram diagram tempMessages labels subtotals |
tempMessages := self perform: (aDictionary at: 'messagesType') asSymbol.
tempMessages ifEmpty: [ self exportEmptyHistogramNamed:
(aDictionary at: 'messagesType'), '-histogram' ].
((((aDictionary at: 'messagesType') = 'tweets') or: [(aDictionary at: 'messagesType') = 'replies'])
and: aTypeString isNumber)
ifTrue: [ messagesDict := tempMessages splitBytimeSpansOf: aTypeString ].
(aTypeString = 'day' or: [ aTypeString = 'days' ])
ifTrue: [ messagesDict := tempMessages splitByDays ].
(aTypeString = 'week' or: [ aTypeString = 'weeks' ])
ifTrue: [ messagesDict := tempMessages splitByWeeks ].
(((aDictionary at: 'messagesType') = 'retweets') or: [ (aDictionary at: 'messagesType') = 'quotes' ])
ifTrue: [
messagesDict := tempMessages asMessagesUserNamesSortedByOccurrences.
((aTypeString > messagesDict size) or: [ aTypeString = 1 ])
ifFalse: [ | keysToRemove |
keysToRemove := OrderedCollection new.
1 to: messagesDict size - aTypeString do:
[ :i | keysToRemove add: (messagesDict keys at: i + aTypeString) ].
messagesDict removeKeys: keysToRemove. ].
labels := messagesDict keys.
labels := labels collect: [ :profiles | ('@', profiles) ].
subtotals := messagesDict values
]
ifFalse: [ labels := messagesDict keys.
subtotals := (messagesDict values collect: [ :collection | collection size ])].
histogram := RSChart new.
histogram extent: (aDictionary at: 'extent').
diagram := RSBarPlot new
y: subtotals.
diagram color: (aDictionary at: 'color').
histogram addPlot: diagram.
histogram addDecoration: (RSHorizontalTick new
fromNames: labels;
labelRotation: 0;
fontSize: 68 /messagesDict size;
yourself).
histogram addDecoration: (RSVerticalTick new
integer;
fontSize: 68 /messagesDict size).
histogram build.
^ histogram canvas exportAsFileNamed: (aDictionary at: 'messagesType'), '-histogram' into: self folder

View File

@ -0,0 +1,11 @@
accessing
exportOverviewReportLatex
| weeks floor divisions |
weeks := ((self newestTweet created - self oldestTweet created) days / 7).
floor := weeks floor.
(weeks - floor) > 0.4
ifTrue: [ divisions := floor ]
ifFalse: [ divisions := floor + 1 ].
self exportOverviewReportLatexWithBars: divisions.
^ self folder

View File

@ -0,0 +1,11 @@
accessing
exportOverviewReportLatexWithBars: anInteger
self
exportDefaultReport;
externalWordCloud;
exportTweetsHistogramWithBars: anInteger;
exportRetweetsHistogramWithBars: anInteger;
exportRepliesHistogramWithBars: anInteger;
exportQuotesHistogramWithBars: anInteger.
^ self folder

View File

@ -3,9 +3,11 @@ exportProfileImageOn: fileReference
| file |
file := fileReference asFileReference.
- file ensureDelete.
- file exists ifFalse: [ file ensureCreateFile ].
+ file exists ifFalse: [
+ file ensureCreateFile.
file binaryWriteStreamDo: [ :stream |
- stream nextPutAll: profileImageUrl retrieveContents ].
- super class inform: 'Exported as: ', String cr, file fullName.
+ stream nextPutAll: profileImageUrl asUrl retrieveContents ].
+ self class inform: 'Exported as: ', String cr, file fullName.
^ file
+ ].
+ self downloadWithRenaming: fileReference

View File

@ -0,0 +1,49 @@
accessing
exportQuotesHistogramWithBars: aNumberOfBars
| quotesDict |
quotesDict := {
'messagesType' -> 'quotes'.
'extent' -> (800@200).
'color' -> (Color r:(89/255) g:(217/255) b:(95/255))
} asDictionary.
^ self exportHistogramFor: quotesDict By: aNumberOfBars
"| keysToRemove quotes labels quotesHistogram diagram |
quotes := self quotes asMessagesUserNamesSortedByOccurrences.
(aNumberOfBars > quotes size) ifTrue: [ ^ self exportQuotesHistogram ].
keysToRemove := OrderedCollection new.
1 to: quotes size - aNumberOfBars do:
[ :i | keysToRemove add: (quotes keys at: i + aNumberOfBars) ].
quotes removeKeys: keysToRemove.
labels := quotes keys.
labels := labels collect: [ :profiles | ('@', profiles) ].
quotesHistogram := RSChart new.
quotesHistogram extent: 800@200.
diagram := RSBarPlot new
y: quotes values.
diagram color: (Color r:(89/255) g:(217/255) b:(95/255)).
quotesHistogram addPlot: diagram.
quotesHistogram addDecoration: (RSHorizontalTick new
fromNames: labels;
labelRotation: 0;
fontSize: 72 /quotes size;
yourself).
quotesHistogram addDecoration: (RSVerticalTick new
asFloat: 2;
fontSize: 72 /quotes size).
quotesHistogram build.
quotesHistogram canvas pdfExporter
zoomToShapes;
noFixedShapes;
fileName: (self folder / 'quotes-histogram.pdf')fullName;
export.
quotesHistogram canvas pngExporter
zoomToShapes;
noFixedShapes;
fileName: (self folder / 'quotes-histogram.png')fullName;
export.
^ self folder / 'quotes-histogram.png'"

View File

@ -0,0 +1,10 @@
accessing
exportRepliesHistogramWithBars: aNumberOfBars
| repliesDict |
repliesDict := {
'messagesType' -> 'replies'.
'extent' -> (800@200).
'color' -> (Color r:(246/255) g:(185/255) b:(46/255))
} asDictionary.
^ self exportHistogramFor: repliesDict By: aNumberOfBars

View File

@ -0,0 +1,10 @@
accessing
exportRetweetsHistogramWithBars: aNumberOfBars
| retweetsDict |
retweetsDict := {
'messagesType' -> 'retweets'.
'extent' -> (800@200).
'color' -> (Color r:(217/255) g:(56/255) b: (124/255))
} asDictionary.
^ self exportHistogramFor: retweetsDict By: aNumberOfBars

View File

@ -0,0 +1,8 @@
accessing
exportStaticWebReport
(self hasFolder: 'commons')
ifFalse: [ self installCommons ].
(self hasFolder: 'templates')
ifFalse: [ self installTemplate ].
^ self exportWithTemplate: (TweetsCollection dataStore / 'templates' / 'index.mus.html') into: self folder

View File

@ -0,0 +1,10 @@
accessing
exportTweetsHistogramWithBars: aNumberOfBars
| tweetsDict |
tweetsDict := {
'messagesType' -> 'tweets'.
'extent' -> (800@200).
'color' -> (Color r:(91/255) g:(131/255) b:(222/255))
} asDictionary.
^ self exportHistogramFor: tweetsDict By: aNumberOfBars

View File

@ -0,0 +1,11 @@
accessing
exportWeekReportLatexBeginningAt: aDay
self config
at: 'reportingPeriod'
put: (Timespan starting: aDay asDateAndTime ending: ( aDay asDateAndTime + 7 days)).
self messages: self messages.
self
exportDefaultReport;
externalWordCloud.
^ self folder

View File

@ -1,12 +1,29 @@
accessing
exportWithTemplate: mustacheFile into: folder
- | tempDictionary modified |
+ | tempDictionary bioModified userModified nameModified |
tempDictionary := self asDictionary copy.
- modified := self asDictionary at: 'profile-bio'.
- modified := modified copyReplaceAll: '@' with: '\@'.
- modified := modified copyReplaceAll: '_' with: '\_'.
- tempDictionary at: 'profile-bio' put: modified.
- MarkupFile
- exportAsFileOn: (folder / self userName , 'tex')
- containing:(mustacheFile asMustacheTemplate value: tempDictionary)
+ (mustacheFile fullName endsWith: '.html')
+ ifTrue: [ ^ MarkupFile
+ exportAsFileOn: (folder / self userName , 'html')
+ containing:(mustacheFile asMustacheTemplate value: tempDictionary)].
+ (mustacheFile fullName endsWith: '.tex')
+ ifTrue: [ bioModified := self asDictionary at: 'profile-bio'.
+ bioModified := bioModified copyReplaceAll: '@' with: '\@'.
+ bioModified := bioModified copyReplaceAll: '_' with: '\_'.
+ bioModified := bioModified copyReplaceAll: '#' with: '\#'.
+ bioModified := bioModified copyReplaceAll: '👑😎' with: ''.
+ bioModified := bioModified copyReplaceAll: '🇨🇴' with: ''.
+ bioModified := bioModified copyReplaceAll: '|' with: ''.
+ userModified := self asDictionary at: 'profile-card-username'.
+ userModified := userModified copyReplaceAll: '_' with: '\_'.
+ nameModified := self asDictionary at: 'profile-card-fullname'.
+ nameModified := nameModified copyReplaceAll: '🇨🇴' with: ''.
+ tempDictionary at: 'profile-bio' put: bioModified;
+ at: 'profile-card-username' put: userModified;
+ at: 'profile-card-fullname' put: nameModified.
+ ^ MarkupFile
+ exportAsFileOn: (folder / self userName , 'tex')
+ containing:(mustacheFile asMustacheTemplate value: tempDictionary)]

View File

@ -1,9 +1,13 @@
accessing
externalWordCloud
+ "TO DO: refactor with externalWordCloudWithLanguage:"
| text outputFile |
- outputFile := (self folder / 'nube.png')fullName.
- text := (self folder / self userName, 'words', 'txt')fullName.
+ self areCommonFilesInstalled
+ ifFalse: [ self installCommons ].
+ self writeWordsFile.
+ outputFile := (self folder / 'wordcloud.png') fullName.
+ text := (self folder / 'words', 'txt') fullName.
OSSUnixSubprocess new
command: 'wordcloud_cli';
arguments: { '--text' . text .
@ -13,8 +17,9 @@ externalWordCloud
'--height' . '357' .
'--background' . 'white' .
'--mode' . 'RGBA' .
- '--stopwords' . '../commons/stopwords-es.txt' .
- '--mask' . '../commons/nube-mascara.jpg'};
+ '--stopwords' . '../../../commons/stopwords-es.txt'.
+ "'--mask' . '../../../commons/nube-mascara.jpg'"
+ };
workingDirectory: self folder fullName;
redirectStdout;
redirectStderr;

View File

@ -0,0 +1,26 @@
accessing
externalWordCloudWithLanguage: language
"I process and render a wordcloud image without stopwords, in: es and en."
| text outputFile |
self areCommonFilesInstalled
ifFalse: [ self installCommons ].
self writeWordsFile.
outputFile := (self folder / 'wordcloud.png') fullName.
text := (self folder / 'words', 'txt') fullName.
OSSUnixSubprocess new
command: 'wordcloud_cli';
arguments: { '--text' . text .
'--imagefile' . outputFile .
'--color' . '#5B83DE' .
'--width' . '1153' .
'--height' . '357' .
'--background' . 'white' .
'--mode' . 'RGBA' .
'--stopwords' . '../../../commons/stopwords-', language, '.txt'.
"'--mask' . '../../../commons/nube-mascara.jpg'"
};
workingDirectory: self folder fullName;
redirectStdout;
redirectStderr;
runAndWaitOnExitDo: [ :process :outString | ^ outputFile asFileReference ]

View File

@ -0,0 +1,14 @@
accessing
getLocalMessages
| allTweets myTweets tweetsWithAntecesor |
TweetsCollection storeDB.
allTweets := Tweet storedInstances asOrderedCollection.
allTweets ifNil: [ ^ nil ].
myTweets := TweetsCollection new.
tweetsWithAntecesor := allTweets select: [ :each | each timelines isNotEmpty and: [ each timelines keys first = self userName ]].
myTweets messages: tweetsWithAntecesor.
self messages: myTweets.
^ self

View File

@ -1,24 +1,5 @@
accessing
getMessages
- | lastTweetsRaw customQuery lastTweets |
- lastTweetsRaw := self rssFeed xmlDocument xpath: '//item'.
- lastTweets := TweetsCollection new.
- customQuery := Dictionary new
- at: 'parameters' put: self userNameLink;
- at: 'date' put: DateAndTime now;
- yourself.
- lastTweets query: customQuery.
- lastTweetsRaw doWithIndex: [ :rssTweet :i | | current previous |
- current := Tweet new fromNitterRssItem: rssTweet.
- i < lastTweetsRaw size ifTrue: [
- previous := Tweet new fromNitterRssItem: (lastTweetsRaw at: i + 1).
- current timelines
- at: self userName put: previous id;
- yourself.
- ].
- current queries add: customQuery.
- lastTweets add: current.
- ].
- self tweets: lastTweets.
- ^ self tweets
+ self getLocalMessages ifNil: [ self getRemoteMessagesFromHtml ].
+ ^ self messages

View File

@ -0,0 +1,19 @@
accessing
getPagesContentsFrom: anURL upTo: anInteger
"I retroactively get all pages' contents up to a specified page number.
TO DO: should this be split back into two methods, one getting the page urls and the other their contents?
Or should we always get the cursor urls and their contents?
[ ] Benchmark alternative approaches."
| response nextPageLink previousPageLink |
response := OrderedDictionary new.
response at: anURL put: (self documentTreeFor: anURL).
previousPageLink := anURL.
anInteger - 1 timesRepeat: [ | pageCursor |
pageCursor := self pageCursorFor:previousPageLink.
nextPageLink := self userNameLink, '/with_replies', pageCursor keys first.
response at: nextPageLink put: (XMLHTMLParser parse:nextPageLink asUrl retrieveContents).
previousPageLink := nextPageLink
].
^ response

View File

@ -0,0 +1,4 @@
accessing
getPagesContentsFromOldestUpto: anInteger
^ self getPagesContentsFrom: ((self oldestTweet metadata select: [ :item | item isString and: [ item beginsWith: 'https://' ]]) values first) upTo: anInteger

View File

@ -0,0 +1,8 @@
accessing
getPagesContentsUpto: anInteger
"I retroactively get all pages' contents up to a specified page number.
TO DO: should this be split back into two methods, one getting the page urls and the other their contents?
Or should we always get the cursor urls and their contents?
[ ] Benchmark alternative approaches."
^ self getPagesContentsFrom: self userNameLinkWithReplies upTo: anInteger

View File

@ -0,0 +1,4 @@
accessing
getRemoteMessagesFromHtml
^ messages := self collectRawTweetsUpToPage: 1

View File

@ -0,0 +1,6 @@
accessing
hasFolder: folderName
| fullFolderPath |
fullFolderPath :=(TweetsCollection dataStore / folderName ).
^ fullFolderPath exists and: [ fullFolderPath children isNotEmpty ]

View File

@ -1,3 +1,7 @@
accessing
id
- ^ id ifNil: [id := (self profileImageUrl segments select: [ :each | each asInteger class = LargePositiveInteger]) first.]
+ ^ id ifNil: [
+ self profileImageUrl
+ ifNil: [ id := 0 ]
+ ifNotNil: [ id := (self profileImageUrl asUrl segments select: [ :each | each isAllDigits ]) first. ]
+ ]

View File

@ -0,0 +1,22 @@
accessing
installCommons
| commonFiles folder |
commonFiles := #(
'https://mutabit.com/repos.fossil/mutabit/raw?name=wiki/commons/stopwords-es.txt&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/raw?name=wiki/commons/stopwords-en.txt&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/uv/wiki/commons/nube-mascara.jpg'
'https://mutabit.com/repos.fossil/mutabit/uv/wiki/commons/logo-mutabit-negro.png').
folder := TweetsCollection dataStore / 'commons'.
folder exists
ifTrue: [ folder ensureDeleteAllChildren ]
ifFalse: [ folder ensureCreateDirectory].
commonFiles do: [ :fileUrl | | temp |
ZnClient new
url: fileUrl;
downloadTo: folder.
temp := (folder children select: [ :file | file basename includesSubstring: 'raw' ]).
temp isNotEmpty ifTrue: [
temp first renameTo: (((fileUrl splitOn: 'raw?') second splitOn: '/') last removeSuffix: '&ci=tip')].
].
^ folder

View File

@ -0,0 +1,22 @@
accessing
installTemplate
| templateFiles folder |
templateFiles := #(
'https://mutabit.com/repos.fossil/mutabit/raw?name=plantillas/TwentySecondsCV/twentysecondcvMod.cls&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/raw?name=plantillas/TwentySecondsCV/template.mus.tex&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/raw?name=plantillas/SarissaPersonalBlog/index.mus.html&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/raw?name=plantillas/SarissaPersonalBlog/output.css&ci=tip'
'https://mutabit.com/repos.fossil/mutabit/raw?name=plantillas/SarissaPersonalBlog/echarts-wordcloud.min.js&ci=tip').
folder := TweetsCollection dataStore / 'templates'.
folder exists
ifTrue: [ folder ensureDeleteAllChildren ]
ifFalse: [ folder ensureCreateDirectory].
templateFiles do: [ :fileUrl |
ZnClient new
url: fileUrl;
downloadTo: folder.
(folder children detect: [ :file | file basename includesSubstring: 'raw' ])
renameTo: (((fileUrl splitOn: 'raw?') second splitOn: '/') last removeSuffix: '&ci=tip')
].
^ folder

View File

@ -0,0 +1,5 @@
accessing
lastTweetsFromHtml
^ (self documentTree xpath: '//div[@class="timeline-item "]')
asOrderedCollection collect: [ :xmlElement | xmlElement postCopy ]

View File

@ -0,0 +1,12 @@
accessing
messages
messages ifNil: [ messages := TweetsCollection new ].
messages ifEmpty: [ self getLocalMessages ].
messages ifEmpty: [ self getRemoteMessagesFromHtml ].
config at: 'reportingPeriod' ifAbsent: [ ^ messages ].
"self configureDefaultReportingPeriod."
^ messages
select: [ :message |
message created
between: self reportingPeriod start
and: self reportingPeriod end ]

View File

@ -1,3 +1,6 @@
accessing
name
- ^ name ifNil: [ name := ((self rssFeed requiredItems title) splitOn: '/') first ]
+ | documentTree |
+ documentTree := [ self documentTree ] onErrorDo: [ ^ nil ].
+ ^ name ifNil: [ name := (documentTree xpath: '//div[@class="profile-card-tabs-name"]//a[@class="profile-card-fullname"]') stringValue ]

View File

@ -0,0 +1,4 @@
accessing
newestTweet
^ (self tweets select: [ :tweet | tweet created = ((self tweets collect: [ :each | each created ]) asSortedCollection last)]) first.

View File

@ -1,18 +0,0 @@
accessing
numberOfURLsForLoadingTweets: number
| collectionURLs count asURLs |
collectionURLs := {
self userNameLink .
(self userNameLink, ((self documentTree xPath: '//a[.="Load more"]') @ 'href') stringValue) .} asOrderedCollection.
number <= 2 ifTrue: [ ^ collectionURLs ].
count := 2.
(number-count) timesRepeat: [ | tempDoc |
tempDoc := XMLHTMLParser parse: (collectionURLs at: count) asUrl retrieveContents.
collectionURLs
add: (self userNameLink,
((tempDoc xPath: '//a[.="Load more"]') @ 'href') stringValue).
count := count+1 ].
asURLs := collectionURLs collect: [ :string | string asUrl ].
^ asURLs.

View File

@ -0,0 +1,4 @@
accessing
oldestTweet
^ (self tweets select: [ :tweet | tweet created = ((self tweets collect: [ :each | each created ]) asSortedCollection first)]) first.

View File

@ -0,0 +1,3 @@
accessing
oldestTweetPageCursor
^ (self oldestTweet metadata select: [ :item | item isString and: [ item beginsWith: 'https://' ]]) values first value

View File

@ -0,0 +1,10 @@
accessing
pageCursorFor: anUrl
| response value key |
response := Dictionary new.
value := self documentTreeFor: anUrl.
key := ((value xpath: '//a[.="Load more"]') @ 'href')stringValue.
^ response
at: key put: value;
yourself

View File

@ -0,0 +1,14 @@
accessing
pageDocTrees: anInteger
| response nextPageLink previousPageLink |
response := OrderedDictionary new.
previousPageLink := self userNameLink.
response add: previousPageLink.
anInteger - 1 timesRepeat: [
nextPageLink := self userNameLink, (self pageCursorFor:previousPageLink) value.
response add: nextPageLink.
previousPageLink := nextPageLink
].
^ response

View File

@ -2,6 +2,8 @@ accessing
profileImageFile
| file |
- file := (self folder / self userName, 'jpg').
- file exists ifTrue: [ ^ file ].
+ file := (self folder / 'profile-image', 'jpg').
+ file exists ifTrue: [ file asFileReference size = 0
+ ifTrue:[ ^ self downloadProfileImage ].
+ ^ file. ].
^ self downloadProfileImage

View File

@ -1,4 +1,8 @@
accessing
profileImageUrl
- ^ profileImageUrl ifNil: [
- profileImageUrl := ((self rssFeed xmlDocument xpath: '//image/url') stringValue copyReplaceAll: '%2F' with: '/') asUrl ]
+ | documentTree |
+ self remoteIsFound
+ ifFalse:[ ^ nil ].
+ documentTree := self documentTree.
+ ^ profileImageUrl := self class nitterProvider, (((documentTree xpath: '//div[@class="profile-card-info"]//a[@class="profile-card-avatar"]') @ 'href') stringValue copyReplaceAll: '%2F' with: '/') copyWithoutFirst

View File

@ -0,0 +1,9 @@
accessing
quotes
self messages ifEmpty: [ self getMessages ].
^ TweetsCollection new
messages: (self messages
select: [ :each |
(each metadata at: 'quote') isNotEmpty and: [ each user userName = self userName] ]);
yourself

View File

@ -0,0 +1,22 @@
accessing
quotesReportData
| tempDict labels xAxis |
self quotes isEmpty
ifTrue: [ ^ { OrderedDictionary new.
('[''', '0', ''']').
('[''', 'No quotes', ''']')} ].
tempDict := self quotes asMessagesUserNamesSortedByOccurrences.
tempDict size > 10 ifTrue: [
tempDict := (tempDict associations copyFrom: 1 to: 10) asOrderedDictionary ].
labels := tempDict keys.
labels := labels collect: [ :profile | ($' asString), '@', profile, ($' asString) ].
xAxis := tempDict values.
xAxis := xAxis collect: [ :value | ($' asString), (value asString), ($' asString) ].
^ {
tempDict.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -0,0 +1,4 @@
accessing
refreshProfileImageUrl
self profileImageUrl: nil.
self profileImageUrl

View File

@ -0,0 +1,5 @@
accessing
remoteIsFound
^ (ZnClient new url: (self userNameLink asUrl); get; response) isNotFound not

View File

@ -0,0 +1,7 @@
accessing
replies
self messages ifEmpty: [ self getMessages ].
^ TweetsCollection new
messages: (self tweets select: [ :each | (each metadata at: 'replie to') isNotEmpty]);
yourself

View File

@ -0,0 +1,19 @@
accessing
repliesByTimeSpan: divisions
| tweetsByTimeSpan xAxis labels |
tweetsByTimeSpan := self collectMessages: [ self replies ] byTimeSpanSplits: divisions.
xAxis := OrderedCollection new.
(tweetsByTimeSpan values collect: [ :collection | collection size ]) do: [ :number |
xAxis add: ($' asString), (number asString), ($' asString)
].
labels := OrderedCollection new.
tweetsByTimeSpan keys do: [ :string |
labels add: ($' asString), string, ($' asString)
].
^ {
tweetsByTimeSpan.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -0,0 +1,23 @@
accessing
repliesByWeeksTimeSpan
| tweetsByTimeSpan xAxis labels |
self replies isEmpty
ifTrue: [ ^ { OrderedDictionary new.
('[''', '0', ''']').
('[''', 'No replies', ''']')} ].
tweetsByTimeSpan := self collectMessages: [ self replies ] byTimeSpanSplits: self tweetsDivisionsByWeeks.
xAxis := OrderedCollection new.
(tweetsByTimeSpan values collect: [ :collection | collection size ]) do: [ :number |
xAxis add: ($' asString), (number asString), ($' asString)
].
labels := OrderedCollection new.
tweetsByTimeSpan keys do: [ :string |
labels add: ($' asString), string, ($' asString)
].
^ {
tweetsByTimeSpan.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -0,0 +1,4 @@
private
repliesFromImage
"Used for testing external classes, disposable"
^ self tweetsFromImage select: [ :tweet | (tweet metadata at: 'replie to') isNotEmpty]

View File

@ -0,0 +1,5 @@
accessing
reportingPeriod
^ self config
at: 'reportingPeriod'
ifAbsentPut: [ { self messages oldest created . self messages newest created } ]

View File

@ -1,6 +1,7 @@
accessing
retrieveContents
self userName ifNil: [^ self].
+ " self retrieveLocalContents ifNotNil: [ ^ self ]."
^ self
id;
name;

View File

@ -0,0 +1,6 @@
accessing
retrieveLocalContents
| profileTemp |
profileTemp := self class stored detect: [ :each | each userName = self userName ].
profileTemp getLocalMessages.
^ profileTemp

View File

@ -0,0 +1,7 @@
accessing
retweets
self messages ifEmpty: [ self getMessages ].
^ TweetsCollection new
messages: (self messages select: [ :each | each authorId ~= self id]);
yourself

View File

@ -0,0 +1,18 @@
accessing
retweetsReportData
| tempDict labels xAxis |
tempDict := self retweets asMessagesUserNamesSortedByOccurrences.
tempDict size > 10 ifTrue: [
tempDict := (tempDict associations copyFrom: 1 to: 10) asOrderedDictionary ].
labels := tempDict keys.
labels := labels collect: [ :profile | ($' asString), '@', profile, ($' asString) ].
xAxis := tempDict values.
xAxis := xAxis collect: [ :value | ($' asString), (value asString), ($' asString) ].
^ {
tempDict.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -1,4 +0,0 @@
accessing
rssFeed
^ RSSTools createRSSFeedFor: self userNameLink , '/rss'

View File

@ -4,10 +4,10 @@ storeContents
| objectString directory tempFile oldFile dehidratated |
dehidratated := self copy.
- dehidratated tweets: nil.
+ dehidratated messages: nil.
objectString := STON toStringPretty: dehidratated.
- directory := (FileLocator userData / 'Socialmetrica' / self userName) ensureCreateDirectory.
- oldFile := directory / self userName, 'ston'.
+ directory := self folder ensureCreateDirectory.
+ oldFile := directory / 'profile', 'ston'.
oldFile exists ifFalse: [
^ MarkupFile exportAsFileOn: oldFile containing: objectString ].

View File

@ -0,0 +1,14 @@
accessing
tweets
| subcollection |
self messages ifEmpty: [ self getMessages ].
"TO DO: It seems that Twitter changes the user id, which we thought was unchangeable. This deals with the detected changes so far."
subcollection := self messages
select: [ :each | self authorIds includes: each authorId ].
(#('FranciaMarquezM' 'sandralajas' 'IBetancourtCol' 'sergio_fajardo' 'ingrodolfohdez' 'CastilloMarelen') includes: self userName)
ifFalse: [ subcollection := self messages select: [ :each | each authorId = self id ]].
^ TweetsCollection new
messages: subcollection;
yourself

View File

@ -0,0 +1,19 @@
accessing
tweetsByTimeSpan: divisions
| tweetsByTimeSpan xAxis labels |
tweetsByTimeSpan := self collectMessages: [ self tweets] byTimeSpanSplits: divisions.
xAxis := OrderedCollection new.
(tweetsByTimeSpan values collect: [ :collection | collection size ]) do: [ :number |
xAxis add: ($' asString), (number asString), ($' asString)
].
labels := OrderedCollection new.
tweetsByTimeSpan keys do: [ :string |
labels add: ($' asString), string, ($' asString)
].
^ {
tweetsByTimeSpan.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -0,0 +1,19 @@
accessing
tweetsByWeeksTimeSpan
| tweetsByTimeSpan xAxis labels |
tweetsByTimeSpan := self collectMessages: [ self tweets] byTimeSpanSplits: self tweetsDivisionsByWeeks.
xAxis := OrderedCollection new.
(tweetsByTimeSpan values collect: [ :collection | collection size ]) do: [ :number |
xAxis add: ($' asString), (number asString), ($' asString)
].
labels := OrderedCollection new.
tweetsByTimeSpan keys do: [ :string |
labels add: ($' asString), string, ($' asString)
].
^ {
tweetsByTimeSpan.
('[', (',' join: xAxis), ']').
('[', (',' join: labels), ']').
}

View File

@ -0,0 +1,12 @@
accessing
tweetsDivisionsByWeeks
| weeks floor divisions |
weeks := ((self newestTweet created - self oldestTweet created) days / 7).
floor := weeks floor.
(weeks - floor) > 0.4
ifTrue: [ floor = 0
ifTrue: [ divisions := 1 ]
ifFalse: [ divisions := floor]]
ifFalse: [ divisions := floor + 1 ].
^ divisions

Some files were not shown because too many files have changed in this diff.