List of commits:
Subject Hash Author Date (UTC)
feedtree: implement direct work with curl's easy interface 7ff0dc14b7a436545cedf9ea3d3c8e724dba323c Alex 2022-07-27 02:36:08
IvoRSS: prepare optional parameter UseDefaultDownloader c167731044a81a2091f93557d7d908f671c20133 Alex 2022-07-26 02:32:56
feedtree: prepare custom curl handling 70eb6c5f40d698b4ad017752f54daed1a09647d5 Alex 2022-07-26 02:01:24
feedtree: explicitly mark when GroupFetch is ready and show time spent 83fe01fb883fcc66849da02eecbdc61f6ea52b99 Alex 2022-07-25 01:18:47
feedtree: rework fetcher to not rely on guids 760fc788f0297847469290a5c75e3df95da3cfe8 Alex 2022-07-24 22:25:26
IvoRSS: change version to 1.1. It's time to release! 8f883741fce50dc3e9c869c2e41e1fb8b67b0048 Alex 2022-07-24 04:19:12
feeds: add even more Amiga related feeds b19a07dcdd1f198e6fff88f13c16962e6ab73084 Alex 2022-07-24 04:18:08
feedtree: reload rss channel whenever user decides to fetch 52c6dde8b832490b873b02d2157da3bc81b73e54 Alex 2022-07-24 04:10:09
lurk: screw it, I run the parser for each link separately 5b0a792527510716e36da04e36689765907500c2 Alex 2022-07-24 02:06:56
do not let text labels to explode eeec1ccb6b0c3c90e2e0803021817a07489c1093 Alex 2022-07-24 01:36:32
IvoR: use title and guid as unique identifiers. Hash them to avoid numerous issues with the serializer 5a6fe4419c9cfdd0a03e18dfb45cf6fb47fff582 Alex 2022-07-24 01:16:36
lurk: use raw encoding where possible. Use italic font style for possible rss channels 102d33ee8611a392b048d4c640bf27d9ad93d6a8 Alex 2022-07-23 23:34:34
feedtree: watch out for spaces in indexes d286760ac19d7b5cf4c3b591a2818f37f2ece3d7 Alex 2022-07-23 23:32:34
textfiled: a tiny nitpicky change 4dc5b68943d1b28fa4f83bd4289e3887668b668f Alex 2022-07-23 21:58:13
feedtree: guid-less channels workaround 99e7e725b1e5ec6a3682b2fbd670f3a719d5727c Alex 2022-07-23 02:14:23
lurk: rework links parsing 27d2aa01be8796427acc9753918a25ca336a523f Alex 2022-07-23 01:50:51
lurk: accept more possible MIME types 851689d4324eea12e0c2ecca65557a130006d253 Alex 2022-07-22 23:58:37
add a space 9e6e85bc191e313bbe842052bd92671bfece74c1 Alex 2022-07-21 02:18:44
add Readme file 0a25299e82fc60cd3d4f2b1fdc0c7047b299bf78 Alex 2022-07-20 01:54:53
add LICENSE file f073c68dbdfd517bdf39f2700369ecfb37f74a2d Alex 2022-07-19 10:12:47
Commit 7ff0dc14b7a436545cedf9ea3d3c8e724dba323c - feedtree: implement direct work with curl's easy interface
Author: Alex
Author date (UTC): 2022-07-27 02:36
Committer name: Alex
Committer date (UTC): 2022-07-27 02:36
Parent(s): c167731044a81a2091f93557d7d908f671c20133
Signer:
Signing key:
Signing status: N
Tree: 0c685082e5cb2c44652e86ff20efff3b2940dc0c
File Lines added Lines deleted
feedtree.hws 109 54
File feedtree.hws changed (mode: 100755) (index 0211706..63f7164)
... ... Function p_FetchAll()
54 54 p_Unlock() p_Unlock()
55 55 EndFunction EndFunction
56 56
57
58 Function progress_curl_callback(total, count, utotal, ucount, url$)
59 mui.Set("status", "Contents",
60 "Downloading (" .. count .. "/" .. total .. ") from " .. url$)
61 CheckEvents() ; keep MUI responsive
62 Return(PleaseStop)
63 EndFunction
64
65 Function write_curl_callback(data$, t)
66 InsertItem(t, data$)
67 CheckEvents() ; keep MUI responsive
68 If PleaseStop Then Return(#CURL_WRITEFUNC_PAUSE)
69 EndFunction
70
71
72
57 73 Function p_TimeFetch(muiid) Function p_TimeFetch(muiid)
58 74 Local tid = StartTimer(Nil) Local tid = StartTimer(Nil)
59 p_FetchGroup(muiid)
75 Local e = Nil
76 If UseDefaultDownloader
77 p_FetchGroup(muiid, Nil)
78 Else
79 e = hurl.Easy()
80 e:SetOpt_Accept_Encoding("")
81 ;e:SetOpt_AutoReferer(1)
82 e:SetOpt_ConnectTimeout(3)
83 e:SetOpt_FileTime(1)
84 e:SetOpt_FailOnError(1)
85 e:SetOpt_FollowLocation(1)
86 e:SetOpt_NoProgress(0)
87 e:SetOpt_TimeCondition(1) ;#HURL_TIMECOND_IFMODSINCE undefined
88 e:SetOpt_UserAgent("IvoRSS")
89 ;e:SetOpt_SSL_FalseStart(1)
90 ;e:SetOpt_TCP_FastOpen(1)
91
92 p_FetchGroup(muiid, e)
93
94 ; destroy easy object
95 e:Close()
96 EndIf
97
60 98 If PleaseStop If PleaseStop
61 99 mui.Set("status", "Contents", "job is interrupted") mui.Set("status", "Contents", "job is interrupted")
62 100 Else Else
 
... ... Function p_FetchCurrent()
74 112 If t.Node If t.Node
75 113 p_TimeFetch(t.muiid) p_TimeFetch(t.muiid)
76 114 Else Else
77 Local status = p_Fetch(t.id, False)
115 Local status = p_Fetch(t.id, Nil)
116 CollectGarbage()
78 117 If status = #FEED_STATUS_NEW If status = #FEED_STATUS_NEW
79 118 p_MarkFeed(t.id, #FEED_STATUS_READ) p_MarkFeed(t.id, #FEED_STATUS_READ)
80 119 Ivor:Load(t.id) Ivor:Load(t.id)
 
... ... EndFunction
107 146 ;č figure out what Node is, ;č figure out what Node is,
108 147 ;č and proceed accordingly ;č and proceed accordingly
109 148 ;č must be recursive ;č must be recursive
110 Function p_FetchGroup(muiid$)
149 Function p_FetchGroup(muiid$, e)
111 150 Local isFound, t = mui.DoMethod(#LIST_TREE, "GetEntry", muiid$, "Head", "") Local isFound, t = mui.DoMethod(#LIST_TREE, "GetEntry", muiid$, "Head", "")
112 151 While isFound And Not PleaseStop While isFound And Not PleaseStop
113 152 If t.Node If t.Node
114 p_FetchGroup(t.muiid)
153 p_FetchGroup(t.muiid, e)
115 154 Else Else
116 p_FetchFeed(t.id)
155 p_FetchFeed(t.id, e)
117 156 EndIf EndIf
118 157 isFound, t = mui.DoMethod(#LIST_TREE, "GetEntry", t.muiid, "Next", "") isFound, t = mui.DoMethod(#LIST_TREE, "GetEntry", t.muiid, "Next", "")
119 158 Wend Wend
 
... ... EndFunction
122 161
123 162 ;č mark either as new or as read ;č mark either as new or as read
124 163 ;č ONLY feeds may be passed here, never groups ;č ONLY feeds may be passed here, never groups
125 Function p_FetchFeed(feedid$)
126 Local status = p_Fetch(feedid$, True)
164 Function p_FetchFeed(feedid$, e)
165 Local status = p_Fetch(feedid$, e)
166 CollectGarbage()
127 167 If status = #FEED_STATUS_SUCCESS If status = #FEED_STATUS_SUCCESS
128 168 If fd_Get(feedid$, "status") = #FEED_STATUS_NEW Then Return() If fd_Get(feedid$, "status") = #FEED_STATUS_NEW Then Return()
129 169 status = #FEED_STATUS_READ status = #FEED_STATUS_READ
 
... ... EndFunction
263 303
264 304
265 305
266
267
268
269 306 ;č download the file ;č download the file
270 307 ;č possible statuses ;č possible statuses
271 308 /* /*
 
... ... EndFunction
274 311 #FEED_STATUS_INTERRUPTED #FEED_STATUS_INTERRUPTED
275 312 #FEED_STATUS_ERROR #FEED_STATUS_ERROR
276 313 */ */
277 Function p_Fetch(url$, fast)
314 Function p_Fetch(url$, e)
278 315 ;č I'm scared. What if some server sends garbage? ;č I'm scared. What if some server sends garbage?
279 316 ;č The user will keep clicking Fetch and get only "Download skipped"? ;č The user will keep clicking Fetch and get only "Download skipped"?
280 Local timestamp = -2
317 ;
281 318 ;č turned out that the TCP handshake is an insanely long affair ;č turned out that the TCP handshake is an insanely long affair
282 319 ;č this, like, almost doubles the download time ;č this, like, almost doubles the download time
283 320 ;č On the other hand, writing something like callbacks in Hollywood ;č On the other hand, writing something like callbacks in Hollywood
284 321 ;č that would allocate buffers is plain nonsense ;č that would allocate buffers is plain nonsense
285 If fast And fd_Get(url$, "timestamp") > -1
286 e = hurl.Easy({URL = url$, FollowLocation = True})
287 ;e:SetOpt_Accept_Encoding("")
288 ;e:SetOpt_AutoReferer(1)
289 e:SetOpt_FileTime(1)
290 e:SetOpt_FailOnError(1)
291 e:SetOpt_Forbid_Reuse(1)
292 ;e:SetOpt_SSL_FalseStart(1)
293 ;e:SetOpt_TCP_FastOpen(1)
294 Function p_WriteData(data$)
295 DebugPrint("Data: ", data$)
296 EndFunction
297 e:SetOpt_WriteFunction(p_WriteData)
298 e:SetOpt_Nobody(1)
299 Local err_code = ?e:Perform()
300 timestamp = e:GetInfo_FileTime()
301 ; destroy easy object
302 e:Close()
303 DebugPrint(timestamp, fd_Get(url$, "timestamp"), url$)
322 If Not isNil(e) ;And timestamp > -1
323 e:SetOpt_URL(url$)
324 e:SetOpt_ProgressFunction(progress_curl_callback, url$) ;userdata
325
326 Local tdata = CreateList()
327 e:SetOpt_WriteFunction(write_curl_callback, tdata)
304 328
305 If timestamp > 0 And fd_Get(url$, "timestamp") = timestamp
306 mui.Set("status", "Contents", "Skipped downloading from " ..url$)
329 Local servertime = fd_Get(url$, "servertime")
330 If servertime > 0
331 e:SetOpt_TimeValue(servertime)
332 DebugPrint("servertime", servertime)
333 ElseIf fd_Get(url$, "localtime") > 0
334 e:SetOpt_TimeValue(fd_Get(url$, "localtime"))
335 DebugPrint("localtime", fd_Get(url$, "localtime"))
336 Else
337 e:UnsetOpt_TimeValue()
338 DebugPrint("Nothing time")
339 EndIf
340
341 ;e:SetOpt_Nobody(1)
342 Local err_code = ?e:Perform()
343 If err_code <> #ERR_NONE
344 If PleaseStop
345 ;mui.Set("status", "Contents", "interrupted")
346 Return(#FEED_STATUS_INTERRUPTED)
347 Else
348 ;č there is some error after all
349 mui.Set("status", "Contents", "\27b" .. GetErrorName(err_code))
350 Return(#FEED_STATUS_ERROR)
351 EndIf
352 ElseIf e:GetInfo_Condition_Unmet()
353 mui.Set("status", "Contents", GetErrorName(err_code)
354 .." (skip downloading from " ..url$ ..")")
355 ;č The file on the server has not changed
356 ;č we do nothing and return success
357 DebugPrint("skipped", url$)
307 358 Return(#FEED_STATUS_SUCCESS) Return(#FEED_STATUS_SUCCESS)
308 ;č If the server won't give a modification time, we note that
309 ;č and next time we won't waste time on it
310 ElseIf timestamp < 0
311 fd_Set(url$, "timestamp", timestamp)
312 EndIf
313 EndIf
359 EndIf
360
314 361
362 ;č success
363 DebugPrint(e:GetInfo_FileTime())
364 Return(p_LookThrough(url$, Concat(tdata), e:GetInfo_FileTime()))
365
366 EndIf
367
368 ;
369 ; Default downloader
370 ;
371
315 372 ;č simplified check: if the string contains a percent sign, ;č simplified check: if the string contains a percent sign,
316 373 ;č we consider it "escaped". ;č we consider it "escaped".
317 374 ;č We don't care about the case where the user sticks a lone percent character into the URL ;č We don't care about the case where the user sticks a lone percent character into the URL
318 375 Local err_code, xml$, count = ?DownloadFile(url$, Local err_code, xml$, count = ?DownloadFile(url$,
319 {Adapter="hurl", Fail404=True, Encoded=FindStr(url$, "%", True) <> -1},
376 {Adapter="hurl", Fail404=True, Encoded=FindStr(url$, "%", True) <> -1},
320 377 p_callback, url$) p_callback, url$)
321 378 p_Replay(err_code, count .. " bytes from " .. url$ .. " transmitted") p_Replay(err_code, count .. " bytes from " .. url$ .. " transmitted")
322 379 CheckEvents() ; keep MUI even more responsive CheckEvents() ; keep MUI even more responsive
 
... ... Function p_Fetch(url$, fast)
324 381
325 382 If PleaseStop Then Return(#FEED_STATUS_INTERRUPTED) If PleaseStop Then Return(#FEED_STATUS_INTERRUPTED)
326 383
327 If fd_isEqualOrSet(url$, "checksum", CRC32Str(xml$))
328 Return(#FEED_STATUS_SUCCESS)
329 EndIf
384 ;č no error? Then we pass it along
385 Return(p_LookThrough(url$, xml$))
386 EndFunction
330 387
388
389 Function p_LookThrough(url$, xml$, servertime)
390 Local checksum = CRC32Str(xml$)
391 If fd_Get(url$, "checksum") = checksum Then Return(#FEED_STATUS_SUCCESS)
392
331 393 Local err_code = ?StringToFile(xml$, p_GetXMLname(url$)) Local err_code = ?StringToFile(xml$, p_GetXMLname(url$))
332 394 p_Replay(err_code, "Updated XML file from " ..url$ .." saved") p_Replay(err_code, "Updated XML file from " ..url$ .." saved")
333 If err_code <> #ERR_NONE
334 ;č Aha... Then we reset the checksum so that next time it tries to save it again
335 fd_Set(url$, "checksum", False)
336 Return(#FEED_STATUS_ERROR)
337 EndIf
338
339 ;č only here do we store the time fingerprint
340 ;č I am well aware that a possible bug in this code would block
341 ;č normal downloading for the user
342 If fast Then fd_Set(url$, "timestamp", timestamp)
343
395 If err_code <> #ERR_NONE Then Return(#FEED_STATUS_ERROR)
344 396
397 fd_Set(url$, "checksum", checksum)
398 fd_Set(url$, "servertime", servertime)
399 fd_Set(url$, "localtime", GetTimestamp(#TIMESTAMP_UNIX))
345 400
346 401 ;č sites often put some BuildDate into the RSS, ;č sites often put some BuildDate into the RSS,
347 402 ;č which formally changes the RSS. ;č which formally changes the RSS.
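Taken together, the hunks above implement HTTP conditional fetching: one reusable curl easy handle per group, an If-Modified-Since time condition seeded from the stored servertime (or localtime), and GetInfo_Condition_Unmet() to detect that the server reported the feed unchanged. A minimal, self-contained sketch of that pattern — illustration only: p_ConditionalGet and p_WriteChunk are hypothetical names, while the hurl option and info calls are the ones used in the diff above:

```hollywood
@REQUIRE "hurl"  ; hURL plugin wraps libcurl's easy interface

; hypothetical write callback: collect body chunks into a list
; (concatenating once at the end avoids repeated string reallocation)
Function p_WriteChunk(data$, chunks)
    InsertItem(chunks, data$)
EndFunction

; hypothetical helper: download url$ only if it changed since lastmod
; Returns: error code, body (empty string on a skip), server file time
Function p_ConditionalGet(url$, lastmod)
    Local e = hurl.Easy()
    e:SetOpt_URL(url$)
    e:SetOpt_FollowLocation(1)
    e:SetOpt_FailOnError(1)
    e:SetOpt_FileTime(1)          ; ask curl to capture the server's file time
    If lastmod > 0
        e:SetOpt_TimeCondition(1) ; 1 = If-Modified-Since (named constant undefined in hURL)
        e:SetOpt_TimeValue(lastmod)
    EndIf

    Local chunks = CreateList()
    e:SetOpt_WriteFunction(p_WriteChunk, chunks)

    Local err = ?e:Perform()
    Local body$, filetime = "", -1
    If err = #ERR_NONE
        filetime = e:GetInfo_FileTime()
        ; Condition_Unmet means "not modified since lastmod"; keep body empty
        If Not e:GetInfo_Condition_Unmet() Then body$ = Concat(chunks)
    EndIf
    e:Close()                     ; destroy the easy object
    Return(err, body$, filetime)
EndFunction
```

On success the caller can store filetime (as p_LookThrough does with servertime) and pass it back as lastmod on the next fetch, so unchanged feeds cost only a single header exchange with no body transfer.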
Hints:

Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/iam-git/IvoRSS

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/iam-git/IvoRSS

Clone this repository using git:
git clone git://git.rocketgit.com/user/iam-git/IvoRSS
