<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Coding Notions]]></title><description><![CDATA[Thoughts | Stories | Ideas]]></description><link>https://codingnotions.com/</link><image><url>https://codingnotions.com/favicon.png</url><title>Coding Notions</title><link>https://codingnotions.com/</link></image><generator>Ghost 4.13</generator><lastBuildDate>Sun, 05 Apr 2026 11:29:00 GMT</lastBuildDate><atom:link href="https://codingnotions.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[The Art of Crafting Effective AI Prompts]]></title><description><![CDATA[<p>Is AI really underperforming, or are we not communicating our requests effectively?</p><p>In the age of artificial intelligence, Large Language Models (LLMs) like Claude and GPT have become indispensable tools for businesses and individuals alike. They assist with everything from drafting emails to generating complex reports. Yet, a common frustration</p>]]></description><link>https://codingnotions.com/the-art-of-crafting-effective-ai-prompts/</link><guid isPermaLink="false">67088f5a4969c1026a388b84</guid><dc:creator><![CDATA[Jeremiah Barrar]]></dc:creator><pubDate>Fri, 11 Oct 2024 02:49:19 GMT</pubDate><media:content url="https://codingnotions.com/content/images/2024/10/ai-1.webp" medium="image"/><content:encoded><![CDATA[<img src="https://codingnotions.com/content/images/2024/10/ai-1.webp" alt="The Art of Crafting Effective AI Prompts"><p>Is AI really underperforming, or are we not communicating our requests effectively?</p><p>In the age of artificial intelligence, Large Language Models (LLMs) like Claude and GPT have become indispensable tools for businesses and individuals alike. 
They assist with everything from drafting emails to generating complex reports. Yet, a common frustration persists: <em>&quot;The AI isn&apos;t giving me the results I want.&quot;</em></p><figure class="kg-card kg-image-card"><img src="https://codingnotions.com/content/images/2024/10/ai.webp" class="kg-image" alt="The Art of Crafting Effective AI Prompts" loading="lazy" width="1024" height="1024" srcset="https://codingnotions.com/content/images/size/w600/2024/10/ai.webp 600w, https://codingnotions.com/content/images/size/w1000/2024/10/ai.webp 1000w, https://codingnotions.com/content/images/2024/10/ai.webp 1024w" sizes="(min-width: 720px) 720px"></figure><p>I&apos;ve noticed a recurring pattern when discussing AI capabilities. Many express dissatisfaction with the outputs they receive. However, when I delve deeper and ask them to articulate what they were trying to achieve, their explanations are often vague, confusing, or even unintelligible. This isn&apos;t a reflection of their intelligence or expertise; rather, it highlights a fundamental challenge in human communication.</p><p><strong>The Human Communication Gap</strong></p><p>Effective communication is a skill that even the most seasoned professionals continually refine. We might have a clear idea in our minds, but translating that into words&#x2014;especially in a way that an AI can understand&#x2014;is another matter entirely. If our peers sometimes struggle to grasp our intentions, it&apos;s reasonable to expect that an AI, which relies strictly on the input it receives, might also struggle.</p><p><strong>Why Prompts Matter</strong></p><p>LLMs are designed to process and generate text based on the input they receive. They don&apos;t infer intent beyond what&apos;s provided. 
Therefore, the specificity and clarity of your prompt directly influence the quality of the AI&apos;s output.</p><p>For example:</p><ul><li><strong>Vague Prompt:</strong> &quot;Write about marketing.&quot;</li><li><strong>Clear Prompt:</strong> &quot;Compose a 500-word article on the impact of social media marketing on consumer purchasing behavior in the fashion industry.&quot;</li></ul><p>The second prompt provides context, scope, and specific details that guide the AI to produce a more targeted and valuable response.</p><p><strong>Bridging the Gap</strong></p><p>To harness the full potential of AI, we need to become better at communicating our ideas:</p><ol><li><strong>Be Specific:</strong> Clearly state what you want. Include details like length, format, style, and key points to cover.</li><li><strong>Provide Context:</strong> Explain the purpose of the output. Is it for a technical audience, a casual blog, or a formal report?</li><li><strong>Use Simple Language:</strong> Avoid ambiguous terms and jargon unless necessary. If you use specialized terminology, ensure it&apos;s correctly defined.</li><li><strong>Iterate and Refine:</strong> If the output isn&apos;t what you expected, adjust your prompt. Consider what might have been misunderstood and clarify.</li></ol><p><strong>Embracing Responsibility</strong></p><p>It&apos;s easy to blame the tool when we don&apos;t get the desired results. However, AI models are reflections of the input they receive. By taking responsibility for how we communicate with them, we not only get better results but also enhance our own communication skills&#x2014;a benefit that extends beyond interacting with AI.</p><p><strong>Conclusion</strong></p><p>LLMs are powerful allies in productivity and creativity, but like any tool, they require skill to use effectively. 
By focusing on how we craft our prompts and communicate our ideas, we can unlock their full potential and achieve outcomes that meet or even exceed our expectations.</p><p>Let&apos;s turn the mirror inward and consider how we can improve our interactions with AI. After all, clear communication isn&apos;t just essential for AI&#x2014;it&apos;s a cornerstone of all successful human endeavors.</p>]]></content:encoded></item><item><title><![CDATA[Automated Backup with Rclone and a systemd Timer]]></title><description><![CDATA[<p>There are many options for creating backups. Some solutions have extensive user interfaces or features such as encryption, compression, deduplication and <a href="https://en.wikipedia.org/wiki/Incremental_backup">block level incremental backup</a>. A nice open source example of this is <a href="https://www.duplicati.com/">Duplicati</a>.</p><p>I chose a simple setup with <a href="https://rclone.org/">Rclone</a> because it fits my needs well. Rclone can synchronize</p>]]></description><link>https://codingnotions.com/fully-automated-backup-rclone/</link><guid isPermaLink="false">60ca5332e042c202f7dc4169</guid><dc:creator><![CDATA[Jeremiah Barrar]]></dc:creator><pubDate>Wed, 16 Jun 2021 21:24:13 GMT</pubDate><media:content url="https://codingnotions.com/content/images/2021/06/rclonesystemd.png" medium="image"/><content:encoded><![CDATA[<img src="https://codingnotions.com/content/images/2021/06/rclonesystemd.png" alt="Automated Backup with Rclone and a systemd Timer"><p>There are many options for creating backups. Some solutions have extensive user interfaces or features such as encryption, compression, deduplication and <a href="https://en.wikipedia.org/wiki/Incremental_backup">block level incremental backup</a>. A nice open source example of this is <a href="https://www.duplicati.com/">Duplicati</a>.</p><p>I chose a simple setup with <a href="https://rclone.org/">Rclone</a> because it fits my needs well. 
Rclone can synchronize files from your computer to any of the common cloud storage providers. It can also perform two-way synchronization, allowing you to mirror folders across computers. There are many more features listed on the website.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://codingnotions.com/content/images/2021/06/image-17.png" class="kg-image" alt="Automated Backup with Rclone and a systemd Timer" loading="lazy" width="678" height="61" srcset="https://codingnotions.com/content/images/size/w600/2021/06/image-17.png 600w, https://codingnotions.com/content/images/2021/06/image-17.png 678w"><figcaption>Rclone in action</figcaption></figure><p>For storage, <a href="https://en.wikipedia.org/wiki/Backblaze">Backblaze B2</a> is reliable and secure while still being cheap. Anything under 10GB is free, and 100GB costs 50 cents per month. B2 has a native ability to keep old versions of files, and the retention policy for old versions is highly configurable.</p><figure class="kg-card kg-image-card"><img src="https://codingnotions.com/content/images/2021/06/image-6.png" class="kg-image" alt="Automated Backup with Rclone and a systemd Timer" loading="lazy" width="475" height="153"></figure><p>You can turn on transparent encryption in the bucket settings.</p><figure class="kg-card kg-image-card"><img src="https://codingnotions.com/content/images/2021/06/image-16.png" class="kg-image" alt="Automated Backup with Rclone and a systemd Timer" loading="lazy" width="533" height="331"></figure><p>The third piece of the puzzle is a way to run Rclone on a schedule. There are two great options for this: <a href="https://en.wikipedia.org/wiki/Cron">cron jobs</a> and <a href="https://wiki.archlinux.org/title/Systemd/Timers">systemd timers</a>. 
A systemd timer is a more modern solution that can be configured to run periodically based on the time that the last backup job finished.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://codingnotions.com/content/images/2021/06/image-18.png" class="kg-image" alt="Automated Backup with Rclone and a systemd Timer" loading="lazy" width="795" height="99" srcset="https://codingnotions.com/content/images/size/w600/2021/06/image-18.png 600w, https://codingnotions.com/content/images/2021/06/image-18.png 795w" sizes="(min-width: 720px) 720px"><figcaption>sudo systemctl status rclone.timer</figcaption></figure><p>These services make setup and usage simple while still having all the features I want:</p><ul><li>Reliability</li><li>Security</li><li>Automation</li><li>Low cost</li><li>Ability to browse backup files from any phone or computer</li></ul><h2 id="setup-instructions">Setup Instructions</h2><p>First, install Rclone. It can be found in most package managers. Then you&apos;ll have to configure your remote. If you choose Backblaze B2, create a bucket and generate a new application key to use with Rclone. Add the remote with <a href="https://rclone.org/commands/rclone_config/">rclone config</a>.</p><p>Next, create a backup script that runs Rclone and specifies the folders you want to back up. I placed my script at <code>/usr/local/bin/rclone-backup.sh</code></p><figure class="kg-card kg-code-card"><pre><code class="language-bash">#!/bin/bash

# Exit if another instance of this script is already running
if [[ &quot;$(pidof -x &quot;$(basename &quot;$0&quot;)&quot; -o %PPID)&quot; ]]; then
    echo &quot;rclone-backup.sh is already running&quot;
    exit;
fi

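# Tip: to preview what a sync would change without transferring anything,
# add --dry-run to the rclone commands below before trusting the script.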
rclone sync -v --fast-list --transfers 30 /home/jeremiah/Documents/ b2:backup-bucket-name/Documents/

rclone sync -v --fast-list --transfers 30 --exclude node_modules/ --exclude .git/ /home/jeremiah/Projects/ b2:backup-bucket-name/Projects/</code></pre><figcaption>/usr/local/bin/rclone-backup.sh</figcaption></figure><p>Give the script executable permissions with <code>chmod +x</code>. Don&apos;t forget to test your script when you think it&apos;s ready.</p><p>Finally, you&apos;ll need to set up systemd. Create a systemd service file at <code>/etc/systemd/system/rclone.service</code></p><figure class="kg-card kg-code-card"><pre><code class="language-makefile">[Unit]
Description=RClone Backup

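# Note: Type=oneshot would also fit a run-to-completion backup job;
# Type=simple works here because rclone.timer triggers on unit inactivity.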
[Service]
Type=simple
ExecStart=/usr/local/bin/rclone-backup.sh</code></pre><figcaption>/etc/systemd/system/rclone.service</figcaption></figure><p>Then create a timer at <code>/etc/systemd/system/rclone.timer</code>. Further documentation can be found <a href="https://www.freedesktop.org/software/systemd/man/systemd.timer.html#">here</a>.</p><figure class="kg-card kg-code-card"><pre><code class="language-makefile">[Unit]
Description=RClone Backup Timer

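# The settings below are monotonic timers: they fire relative to events
# (boot, timer activation, service deactivation) rather than wall-clock
# time the way OnCalendar= would.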
[Timer]
Unit=rclone.service
# Run 15 minutes after boot, since the timer must run at least once
# before OnUnitInactiveSec will trigger
OnBootSec=15m
# Run 15 minutes after rclone.service last finished
OnUnitInactiveSec=15m
# Run once when the timer is first started
OnActiveSec=1s

[Install]
WantedBy=timers.target</code></pre><figcaption>/etc/systemd/system/rclone.timer</figcaption></figure><p>After saving these files, you may need to run:<br><code>sudo systemctl daemon-reload</code></p><p>Make the timer start on boot by enabling the timer:<br><code>sudo systemctl enable rclone.timer</code></p><p>Start the timer now with:<br><code>sudo systemctl start rclone.timer</code></p><p>The backup script should run once and the timer will trigger it again in 15 minutes. You can check the timer status with:<br><code>sudo systemctl status rclone.timer</code></p><pre><code>rclone.timer - RClone Backup Timer
 Loaded: loaded (/etc/systemd/system/rclone.timer;
    enabled; vendor preset: disabled)
 Active: active (waiting)
    since Wed 2021-06-16 12:00:24 PDT; 1h 41min ago
 Trigger: Wed 2021-06-16 13:56:10 PDT; 14min left
 Triggers: rclone.service

systemd[1]: Started RClone Backup Timer.
</code></pre><p>You can also check the status of the service with:<br><code>sudo systemctl status rclone.service</code></p><pre><code>rclone.service - RClone Backup
 Loaded: loaded (/etc/systemd/system/rclone.service; static)
 Active: inactive (dead)
    since Wed 2021-06-16 13:41:10 PDT; 6min ago
 TriggeredBy: rclone.timer
 Process: 51857 ExecStart=/usr/local/bin/rclone-backup.sh
    (code=exited, status=0/SUCCESS)
 Main PID: 51857 (code=exited, status=0/SUCCESS)
 CPU: 1.248s

rclone-backup.sh[51917]: INFO: There was nothing to transfer
rclone-backup.sh[51917]: Checks: 476 / 476, 100%
systemd[1]: rclone.service: Deactivated successfully.
systemd[1]: rclone.service: Consumed 1.248s CPU time</code></pre><p>If you want to manually run a backup at any time, just start the service manually:<br><code>sudo systemctl start rclone.service</code></p>]]></content:encoded></item><item><title><![CDATA[Caching Everything on Cloudflare and Automating Purge]]></title><description><![CDATA[<p>Websites often have content that rarely changes. Cloudflare offers a free service that hosts your content on <a href="https://www.cloudflare.com/network/">200+ servers around the world</a>. As content changes, the cached copies are invalidated and fetched from the origin server. In this way, a website can handle millions of requests at high speed with</p>]]></description><link>https://codingnotions.com/caching-everything-on-cloudflare-and-automating-purge/</link><guid isPermaLink="false">5f7e943a04fddf97d46d17d1</guid><category><![CDATA[Web Development]]></category><category><![CDATA[Cloudflare]]></category><category><![CDATA[Performance]]></category><category><![CDATA[Ghost]]></category><dc:creator><![CDATA[Jeremiah Barrar]]></dc:creator><pubDate>Wed, 07 Oct 2020 16:03:00 GMT</pubDate><media:content url="https://codingnotions.com/content/images/2020/10/cf-logo.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://codingnotions.com/content/images/2020/10/cf-logo.jpg" alt="Caching Everything on Cloudflare and Automating Purge"><p>Websites often have content that rarely changes. Cloudflare offers a free service that hosts your content on <a href="https://www.cloudflare.com/network/">200+ servers around the world</a>. As content changes, the cached copies are invalidated and fetched from the origin server. In this way, a website can handle millions of requests at high speed with a single core origin server.</p><p>There are a few Cloudflare page rules that enable caching of an entire website. The first rule is an exception that allows the admin area to be accessed properly. 
The second rule tells Cloudflare to cache everything on the edge servers for 30 days.</p><figure class="kg-card kg-image-card kg-width-wide"><img src="https://codingnotions.com/content/images/2020/10/image-4.png" class="kg-image" alt="Caching Everything on Cloudflare and Automating Purge" loading="lazy" width="722" height="166" srcset="https://codingnotions.com/content/images/size/w600/2020/10/image-4.png 600w, https://codingnotions.com/content/images/2020/10/image-4.png 722w"></figure><p>Let&apos;s say content changes on the website; for example, a new blog post is published. There&apos;s an easy way to let Cloudflare know so all the edge servers get a new copy of the changed pages. Using the <a href="https://api.cloudflare.com/#zone-purge-files-by-url">Purge Files by URL API method</a>, it&apos;s possible to tell Cloudflare to expire the cache for a single URL or multiple URLs. Each page on the website is slightly slower on the first page load, but the page will be cached from then on. In the case of a blog post, the homepage URL, author URL and tag URLs would need to be purged. Depending on your web stack it might be straightforward to implement. On Ghost blogging software it can be done using multiple webhooks such as <code>post.added</code>; <a href="https://github.com/TryGhost/Ghost/tree/baa81188935dd8db935894b9bf7aeb3fd600c719/core/server/services/webhooks">Ghost will send all the URLs</a> you need in the request. If you want to easily inspect what is sent to the webhook endpoint, I recommend <a href="https://webhook.site/">https://webhook.site/</a>.</p><p>For the sake of simplicity I used the Purge All Files method. My website is small enough and the server can handle the load, so it doesn&apos;t really matter if the entire cache is purged. I created a webhook in Ghost that fires when anything on the site changes; this particular webhook does not send any data about what changed. 
For the webhook endpoint I set up a simple Cloudflare worker, which is included in the free plan. When the webhook fires, the worker receives the request and then sends a purge request to the Cloudflare API. Here&apos;s the Cloudflare worker code I used, <a href="https://gist.github.com/vdbelt/20f116236d2ebffa92f131e679c0551a">based on code I found here</a>:</p><pre><code class="language-js">addEventListener(&apos;fetch&apos;, event =&gt; {

    event.respondWith(purgeCache(event.request))

})

async function purgeCache(request) {

    const url = new URL(request.url)
    const zone = url.searchParams.get(&apos;zone&apos;)
    
    // Validate inputs: the zone ID is 32 lowercase hex-style characters;
    // the key must match the shared secret exactly
    let zoneIdValidated = /^[a-z0-9]{32}$/.test(zone)
    let keyValidated = url.searchParams.get(&apos;key&apos;) === &apos;SECRET_KEY&apos;

    if (!keyValidated) {
        return new Response(&apos;Invalid auth&apos;, {
          status: 403
        })
    }

    if (!zoneIdValidated) {
        return new Response(&apos;Invalid Zone ID&apos;, {
          status: 400
        })
    }

    const data = {
        method: &apos;POST&apos;,
        headers: {
            &apos;Content-Type&apos;: &apos;application/json&apos;,
            &apos;Authorization&apos;: &apos;Bearer CLOUDFLARE_API_TOKEN_WITH_PURGE_PERMISSION&apos;
        },
        body: &apos;{&quot;purge_everything&quot;:true}&apos;
    }
    
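    // Forward a purge-everything request to the Cloudflare API; the token
    // must be an API token with the cache purge permission for this zone.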
    const response = await fetch(
        &apos;https://api.cloudflare.com/client/v4/zones/&apos; + zone + &apos;/purge_cache&apos;,
        data
    )

    return response
}</code></pre><p>The webhook endpoint will look like this: <a href="https://workername.yoursubdomain.workers.dev/?key=SECRET_KEY&amp;zone=CLOUDFLARE_ZONE"><code>https://workername.yoursubdomain.workers.dev/?key=SECRET_KEY&amp;zone=CLOUDFLARE_ZONE</code></a> </p><p>You&apos;ll have to create a new integration in the Ghost admin area:</p><figure class="kg-card kg-image-card kg-width-wide"><img src="https://codingnotions.com/content/images/2020/10/image-3.png" class="kg-image" alt="Caching Everything on Cloudflare and Automating Purge" loading="lazy" width="1052" height="614" srcset="https://codingnotions.com/content/images/size/w600/2020/10/image-3.png 600w, https://codingnotions.com/content/images/size/w1000/2020/10/image-3.png 1000w, https://codingnotions.com/content/images/2020/10/image-3.png 1052w"></figure><p>Propagation should take around 10 seconds after site content changes.</p><p><strong>Final Thoughts</strong></p><ul><li>It would be interesting to implement a Ghost plugin that communicates directly with the Cloudflare API and purges all the relevant URLs.</li><li>Ghost is open source and the &quot;Site changed&quot; webhook could be updated so it sends data about what changed. Then the Cloudflare worker could identify what needs to be purged.</li><li>Any updates to CSS or JavaScript files will require a manual purge, a separate server process that watches for file changes could trigger the Cloudflare worker to purge the cache or call the API method directly. It wouldn&apos;t be too hard to implement in Node, and Ghost is already running on Node.</li></ul>]]></content:encoded></item><item><title><![CDATA[Creating a Smart Air Quality Sensor]]></title><description><![CDATA[<p>The <a href="https://www.washingtonpost.com/weather/2020/09/30/western-wildfire-nasa-satellite/">west coast fires of September 2020</a> produced massive amounts of smoke and air pollution rose to dangerous levels. 
I stumbled upon the <a href="https://www2.purpleair.com/collections/air-quality-sensors/products/purpleair-pa-ii">PurpleAir PA-II</a> air quality sensor which was providing much of the state-wide air quality data. Internally it uses a PMS5003 and I picked one up for $20</p>]]></description><link>https://codingnotions.com/creating-a-wifi-air-quality-sensor/</link><guid isPermaLink="false">5f66d0dbb96b4c7639bb7fcd</guid><category><![CDATA[hardware]]></category><category><![CDATA[sensors]]></category><category><![CDATA[arduino]]></category><category><![CDATA[iot]]></category><dc:creator><![CDATA[Jeremiah Barrar]]></dc:creator><pubDate>Fri, 02 Oct 2020 16:32:00 GMT</pubDate><media:content url="https://codingnotions.com/content/images/2020/10/aqs-1.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://codingnotions.com/content/images/2020/10/aqs-1.jpg" alt="Creating a Smart Air Quality Sensor"><p>The <a href="https://www.washingtonpost.com/weather/2020/09/30/western-wildfire-nasa-satellite/">west coast fires of September 2020</a> produced massive amounts of smoke, and air pollution rose to dangerous levels. I stumbled upon the <a href="https://www2.purpleair.com/collections/air-quality-sensors/products/purpleair-pa-ii">PurpleAir PA-II</a> air quality sensor, which was providing much of the state-wide air quality data. Internally it uses a PMS5003 and I picked one up for $20 on eBay.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://codingnotions.com/content/images/2020/10/aqs.jpg" class="kg-image" alt="Creating a Smart Air Quality Sensor" loading="lazy" width="449" height="384"><figcaption>&#xA9; adafruit.com</figcaption></figure><p>The goal of this project was to create an internet-connected air quality sensor that records data in real time. This sensor was also used to test the effectiveness of a cheap non-HEPA filter and a true HEPA filter. 
The cheap filter appeared to work quite well for this type of pollution.</p><p>The sensor can detect particles down to 0.3 microns and sends data over serial. Some models support I2C as well. There are many libraries that can translate the binary data; I ended up using <a href="https://github.com/adafruit/Adafruit_PM25AQI">Adafruit_PM25AQI</a>. I used only PM1.0 and PM2.5 data since PM10 <a href="https://aqicn.org/sensor/pms5003-7003/">seems to be extrapolated</a> from PM2.5 on this device.</p><p>The sensor is connected to an <a href="https://en.wikipedia.org/wiki/ESP32">ESP32</a> which sends data over Wi-Fi to a cloud server running Node and MySQL. As data is received, the Node app records timestamped readings into a database. Node can update graphs in real time using a WebSocket connection to a browser, and historical data can be viewed as well.</p><p>The code for the ESP32 is simple: every 5 seconds, data is read from the sensor and sent to the Node endpoint. A key is used to prevent false data and identify the device.</p><pre><code class="language-c">#include &quot;Adafruit_PM25AQI.h&quot;
#include &lt;WiFi.h&gt;
#include &lt;HTTPClient.h&gt;

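// The Adafruit_PM25AQI library is available through the Arduino Library Manager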
const char* ssid = &quot;SSID&quot;;
const char* password = &quot;PASSWORD&quot;;

Adafruit_PM25AQI aqi = Adafruit_PM25AQI();

void setup() {
  Serial.begin(115200);
  while (!Serial) delay(10);
  delay(1000);
  WiFi.begin(ssid, password);
  Serial.println(&quot;Connecting to wifi&quot;);
  while(WiFi.status() != WL_CONNECTED) {
    delay(500);
  }
  Serial.print(&quot;Connected to WiFi, IP Address: &quot;);
  Serial.println(WiFi.localIP());

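  // Serial2 defaults to GPIO16 (RX) / GPIO17 (TX) on most ESP32 dev boards;
  // connect the PMS5003 TX pin to GPIO16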
  Serial2.begin(9600);

  if (!aqi.begin_UART(&amp;Serial2)) {
    Serial.println(&quot;Could not find PM 2.5 sensor&quot;);
  }
}

void loop() {
  PM25_AQI_Data data;
  if (! aqi.read(&amp;data)) {
    delay(500);
    return;
  }
  if(WiFi.status() == WL_CONNECTED){
    HTTPClient http;
    http.begin(&quot;https://example.com/aqi/sendData&quot;);
    http.addHeader(&quot;Content-Type&quot;, &quot;application/x-www-form-urlencoded&quot;);
    // Wrapping the first literal in String() makes the + concatenation valid C++
    String httpRequestData = String(&quot;key=gj29xh3ngfw32pz84ns&quot;)
    + &quot;&amp;pm1=&quot; + String(data.pm10_env)
    + &quot;&amp;pm25=&quot; + String(data.pm25_env);
    int httpResponseCode = http.POST(httpRequestData);
    Serial.print(&quot;HTTP Response code: &quot;);
    Serial.println(httpResponseCode);
    http.end();
  }
  else {
    Serial.println(&quot;WiFi Disconnected&quot;);
  }

  delay(5000);
}</code></pre><p>To be continued</p>]]></content:encoded></item></channel></rss>