Author Topic: Background Downloading from the net


Offline SMcNeill

Background Downloading from the net
« on: February 24, 2019, 05:22:47 am »
Code: QB64:
CONST NumberOfFilesToDownload = 1 'Set these for the files you want to continuously download in the background
DIM SHARED DownloadList(NumberOfFilesToDownload) AS STRING
DownloadList(1) = "http://rss.wunderground.com/q/zmw:24138.1.99999"

DIM SHARED Maintimer AS LONG, Downloadtimer AS LONG
DIM SHARED KeepDownloading(NumberOfFilesToDownload) AS LONG
DIM SHARED DownloadHandle(NumberOfFilesToDownload) AS LONG
DIM SHARED DownloadData(NumberOfFilesToDownload) AS STRING

Maintimer = _FREETIMER
Downloadtimer = _FREETIMER

ON TIMER(Maintimer, 6) RestartDownload 'This is the time between background refreshes that we want from our webpages/RSS feeds
ON TIMER(Downloadtimer, 0.05) BackgroundUpdate 'This is the timer so we download our information in the background. It's currently set to poll the remote server 20 times per second

TIMER(Maintimer) ON 'NOTE: when a timer is turned ON, it won't fire until AFTER the specified amount of time has passed
InitDownloads 'To bypass this initial wait, we manually call InitDownloads ourselves to start the process

DO
    _LIMIT 60
    IF LEN(DownloadData(1)) = 0 THEN TimesRefreshed = TimesRefreshed + 1
    'Notice TimesRefreshed WON'T increase by one each refresh.
    'Our main loop is making 60 passes per second (as per our _LIMIT)
    'and the download process only runs 20 times a second.
    'There's also the small lag to account for from when we first send data to open the remote server
    'to when we get our first data back.
    '
    'For my test on my PC, this number increments by about 20 (+/-5) each refresh
    LOCATE 1, 1: PRINT LEN(DownloadData(1)), TimesRefreshed
LOOP

SUB RestartDownload 'restart the downloads each time Maintimer fires, to refresh DownloadData() in the background
    FOR i = 1 TO NumberOfFilesToDownload 'clear the old files
        KeepDownloading(i) = 0
        DownloadData(i) = ""
        CLOSE DownloadHandle(i) 'close any old connections (if the remote server hasn't already)
    NEXT
    InitDownloads 'reopen all those closed handles for fresh transfers
END SUB

SUB BackgroundUpdate
    FOR i = 1 TO NumberOfFilesToDownload
        GET #DownloadHandle(i), , a$ 'read whatever the server has sent since the last poll
        IF a$ <> "" THEN
            DownloadData(i) = DownloadData(i) + a$
        ELSE
            KeepDownloading(i) = KeepDownloading(i) + 1 'Let's track our number of blank responses
        END IF
        IF KeepDownloading(i) > 40 THEN CloseConnection = CloseConnection + 1 'if we get no data from anything for 2 seconds, we stop trying
    NEXT
    IF CloseConnection = NumberOfFilesToDownload THEN TIMER(Downloadtimer) OFF
END SUB

SUB InitDownloads
    FOR i = 1 TO NumberOfFilesToDownload
        DownloadHandle(i) = Opendownload(DownloadList(i))
    NEXT
    BackgroundUpdate 'manually call the background service once so we don't wait for the timer
    TIMER(Downloadtimer) ON
END SUB

FUNCTION Opendownload (url$)
    link$ = url$
    url2$ = RTRIM$(LTRIM$(link$))
    url4$ = RTRIM$(LTRIM$(link$))
    IF LEFT$(UCASE$(url2$), 7) = "HTTP://" THEN url4$ = MID$(url2$, 8) 'strip the protocol prefix
    x = INSTR(url4$, "/")
    IF x THEN url2$ = LEFT$(url4$, x - 1) 'url2$ holds the host name, url4$ the host name plus path
    NewsClient = _OPENCLIENT("TCP/IP:80:" + url2$)
    IF NewsClient = 0 THEN EXIT FUNCTION 'couldn't connect; return 0
    e$ = CHR$(13) + CHR$(10) 'end of line characters
    url3$ = RIGHT$(url4$, LEN(url4$) - x + 1) 'the path portion of the URL, starting with "/"
    x$ = "GET " + url3$ + " HTTP/1.1" + e$
    x$ = x$ + "Host: " + url2$ + e$ + e$
    PUT #NewsClient, , x$ 'send the request; the reply streams back via the GET # polls in BackgroundUpdate
    Opendownload = NewsClient
END FUNCTION

Something I'm always needing to do, yet have never actually sat down and worked out all the details of before: a method to download files as a background process inside another program.

My latest little clock program grabs information from an RSS feed on the internet -- https://www.qb64.org/forum/index.php?topic=1095.0

In it, the website I get information from drops a long bomb of data which takes several seconds to download.  Since the program is a clock, it NEEDS to keep updating the screen so the second hand moves correctly, without a visible pause every time the download routine is called.  The routine currently inside that program is insufficient for background downloading, so I wrote up this little bit of code to plug into it without any issues.

I think this little program comments itself well enough that nobody should have any trouble understanding how it works, but if you're the least bit confused about any part of it, feel free to ask and I'll explain whatever needs explaining.
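One detail worth spelling out anyway: since Opendownload sends a raw GET request, DownloadData() ends up holding the complete HTTP response, headers and all.  A minimal sketch of peeling off just the body once a refresh has had time to finish (assuming the simple case where the server isn't sending the body with chunked transfer encoding):

Code: QB64:
crlf$ = CHR$(13) + CHR$(10)
headerEnd = INSTR(DownloadData(1), crlf$ + crlf$) 'the headers end at the first blank line
IF headerEnd THEN body$ = MID$(DownloadData(1), headerEnd + 4) 'everything after that blank line is the page/feed itself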

The demo is set to poll the file information 20 times a second and do a complete refresh of the data every 6 seconds.  (Which is REALLY insane for constantly polling a website for information, like an RSS feed, which normally only needs to update once every fifteen minutes or so...  Some servers might flag your IP for a DoS and refuse connections if you poll them too often, so be careful with that.  For my needs, I'm just going to refresh the weather data every 10 minutes while my clock runs.  Nobody anywhere should object to that small an amount of automated connections...)
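If you want that slower ten-minute refresh, the only line that has to change is the Maintimer interval (shown here as a hypothetical tweak, not part of the demo above):

Code: QB64:
ON TIMER(Maintimer, 600) RestartDownload '600 seconds = one full refresh every 10 minutes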
https://github.com/SteveMcNeill/Steve64 — A github collection of all things Steve!

Offline Pete

Re: Background Downloading from the net
« Reply #1 on: February 24, 2019, 07:10:11 am »
I usually slave downloads out to wget or curl with shell calls, but I have used _openclient in the past. Neat stuff!
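For anyone who hasn't tried the shell route, a rough sketch of what I mean (assuming curl is installed and on your PATH; the output file name is just for illustration):

Code: QB64:
url$ = "http://rss.wunderground.com/q/zmw:24138.1.99999"
SHELL _DONTWAIT _HIDE "curl -s -o feed.xml " + CHR$(34) + url$ + CHR$(34) '_DONTWAIT returns immediately, so the download runs in the background

The trade-off versus Steve's approach is that you have to work out for yourself when the file is actually complete, since _DONTWAIT hands the job off and never reports back.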

So...

IF LEFT$(UCASE$(url2$), 7) = "HTTP://" THEN url4$ = MID$(url2$, 8)

How about this instead?

IF LEFT$(UCASE$(url2$), 4) = "HTTP" THEN url4$ = MID$(url2$, INSTR(url2$, "://") + 3)

That way, you take care of both https and http URLs. ;)
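A quick check with a made-up URL, just to show what ends up in url4$:

Code: QB64:
url2$ = "https://www.qb64.org/forum/index.php"
IF LEFT$(UCASE$(url2$), 4) = "HTTP" THEN url4$ = MID$(url2$, INSTR(url2$, "://") + 3)
PRINT url4$ 'prints www.qb64.org/forum/index.php -- the same line handles http:// URLs too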

Pete

Want to learn how to write code on cave walls? https://www.tapatalk.com/groups/qbasic/qbasic-f1/

Offline SMcNeill

Re: Background Downloading from the net
« Reply #2 on: February 24, 2019, 07:20:05 am »
HTTPS won't work without jumping through hoops much more complicated than this.  (For one thing, you need to connect to a different port to start with...)

A quick read on it here: https://biztechmagazine.com/article/2007/07/http-vs-https
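To put the port part in concrete terms (a sketch only, not a working HTTPS client):

Code: QB64:
'HTTPS servers listen on port 443 rather than 80, so the connection line alone would become:
'NewsClient = _OPENCLIENT("TCP/IP:443:" + url2$)
'but the server then expects a TLS handshake before any GET request, which plain _OPENCLIENT can't provide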
https://github.com/SteveMcNeill/Steve64 — A github collection of all things Steve!

Offline Pete

Re: Background Downloading from the net
« Reply #3 on: February 24, 2019, 07:26:02 am »
Ah, it has been a while. That's probably the reason why I started using Curl and Wget. Wget finally started supporting https, and Curl has for some time. It would be great if QB64 could be updated at some point, too. A lot of newer sites are https now... including this one!

Pete
Want to learn how to write code on cave walls? https://www.tapatalk.com/groups/qbasic/qbasic-f1/