   smoothing out the jags
Wednesday, July 31 2024
The replacement 8TB hard drive arrived today, and I immediately tried swapping its controller board onto my old failing 8TB drive, which looked to be nearly identical. The drive powered up and made some sounds but then eventually gave up. There must've been some sort of incompatibility, and it was clear this combination of equipment would never work. So I went to plan B: putting the failing hard drive in the freezer, getting it nice and frosty, and then attempting to recover what I could of what was new on it (stuff added in the past three years) onto the new 8TB drive. This worked much better than expected. Once cooled down to freezer temperatures, the failing drive worked for about an hour before Woodchuck would lose its connection to it. So then I'd just put it in the freezer again for a couple hours and recover some more files. When I'd bring the frozen hard drive out into the humid summer air, it would immediately become covered with droplets of condensation, but evidently this didn't affect its operation. Of course, now I'll be using another Seagate Barracuda Compute 8TB hard drive despite what the previous hard drive did to my feelings about the entire Seagate brand, particularly this model. I'll probably be getting an even bigger hard drive before too long from a different manufacturer, at which point I'll put this replacement drive into early retirement.

I managed to find yet more things to do with my ESP8266 Remote Control system. Today I added a mechanism whereby someone with access to a tenant can create an invitation link and send it to someone else; if the recipient creates a user with that link, the new user is attached to the tenant. I also made it so that someone creating a user with no such tenant connection gets a tenant automatically created and linked to their user. I still need to figure out a way to allow users who are not super users to edit information about their tenant, run pre-built reports, and that sort of thing.
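(For flavor, here's a minimal sketch of the invitation-link idea. All the names are hypothetical, the state lives in memory rather than a database, and none of this is the actual code from my system:)

```javascript
// Sketch of a tenant invitation link (hypothetical names, in-memory state):
// a random token is stored with the tenant that issued it, and redeeming
// the token attaches the newly created user to that tenant.
const crypto = require('crypto');

const invitations = new Map(); // token -> tenantId
let nextTenantId = 1;

function createInvitationLink(tenantId, baseUrl) {
  const token = crypto.randomBytes(16).toString('hex'); // unguessable token
  invitations.set(token, tenantId);
  return `${baseUrl}/signup?invite=${token}`;
}

function redeemInvitation(token, newUser) {
  if (token && invitations.has(token)) {
    newUser.tenantId = invitations.get(token); // attach user to the inviting tenant
    invitations.delete(token);                 // links are one-time use
  } else {
    newUser.tenantId = nextTenantId++;         // no valid invite: auto-create a tenant
  }
  return newUser;
}
```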
One lingering issue is the jaggedness of the battery percentage line on the inverter data graph. I've wanted smoother data, with values that can be decimals between the integer percentages, and up until today the focus was on coming up with my own independent means of determining the remaining charge left in the battery. But there are just too many variables influencing that value, and some of them are highly non-linear, meaning I would have to somehow experimentally come up with a complex function in order to get this right. It would also depend on lookup tables, since all batteries lose capacity over time (meaning that data indicating a specific percentage today probably would indicate a different percentage five years from now). I asked ChatGPT how I might do this, and its answer was so non-specific and devoid of actionable steps that I decided not to pursue it any further. As I thought more about it, I realized that integer percentages were fine for all my needs (that is, checking to see how full the battery is and algorithmically determining when to turn loads on and off) with one exception: the graph. If I could just make the plot of battery percentage less jagged, that was all I really needed.
So I dove into the documentation of Chart.js, the JavaScript library I use to draw my graphs. It has a feature called "tension" that makes the connection between dots on a graph more fluid. So I tried that and was disappointed by the results. It turned out that I was failing to take into account that, at the level of time-granularity I was storing data, there would be many consecutive data points at any particular percentage even when the battery percentage was changing rapidly. This meant that any tricks with drawing lines between points wouldn't work, since there would be, say, a sequence of 15 data points at 89 followed by a sequence of 15 data points at 90. The only place for smoothing out the curve would be between the last data point at 89 and the first at 90, and that would have to be a vertical line, since nothing else would make any sense. The steps were baked into the data.
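(To give a sense of what I tried: tension is just a per-dataset option in Chart.js. A minimal sketch, assuming Chart.js 3+ syntax and made-up data, would be something like this:)

```javascript
// Minimal Chart.js line chart with tension (Chart.js 3+ syntax; data made up).
// Tension curves the line between points, but it can't help when the data
// itself is a staircase of repeated integer values.
new Chart(document.getElementById('batteryChart'), {
  type: 'line',
  data: {
    labels: ['12:00', '12:05', '12:10', '12:15'],
    datasets: [{
      label: 'Battery %',
      data: [89, 89, 90, 90],
      tension: 0.4 // 0 = straight segments, higher = curvier
    }]
  }
});
```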
That insight led me into a very productive conversation with ChatGPT, excerpts of which you can see below:

Me:
let's say i have an array of integers and would like to produce an array of decimal values with two decimal places of precision with the harsh transitions from one integer to the next smoothed out. can you make me a javascript function to do this?

ChatGPT:

To smooth out harsh transitions between integers in an array and produce a new array of decimal values with two decimal places of precision, you can use a simple linear interpolation or smoothing technique. One way to achieve this is by averaging each value with its neighbors, which helps to reduce sharp transitions.

Here's a JavaScript function that takes an array of integers and returns a new array of smoothed decimal values:

...
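(The code itself is elided above. What follows is my sketch of the kind of function ChatGPT described, not its verbatim output: a simple average of each value with its immediate neighbors, rounded to two decimal places.)

```javascript
// Sketch: average each value with its immediate neighbors,
// rounding the result to two decimals.
function smoothArray(values) {
  return values.map((value, i) => {
    const prev = i > 0 ? values[i - 1] : value;                 // edge: reuse the first value
    const next = i < values.length - 1 ? values[i + 1] : value; // edge: reuse the last value
    return Math.round(((prev + value + next) / 3) * 100) / 100; // two decimal places
  });
}
```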

Customization

You can adjust the smoothing technique by using different methods, such as applying a weighted average or using a more sophisticated smoothing algorithm like Gaussian smoothing. If the first and last values should also be smoothed, you can consider alternative boundary handling methods, such as extending the array or using different averaging schemes.

Me:

that's good, but the window it is looking at in the data (three data points wide) is too narrow. how about a version of this function where that window size can be passed in as a second parameter?

ChatGPT:

Certainly! You can extend the function to accept a windowSize parameter that defines the number of neighboring elements to consider on either side of each element when calculating the smoothed value. This will provide greater flexibility in smoothing over a wider range of data points.

Here's the updated function:

...
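(Again, a sketch rather than ChatGPT's verbatim code: the same averaging, but with the number of neighbors considered on each side passed in as windowSize.)

```javascript
// Sketch: each output is the average of all values within
// windowSize positions on either side that actually exist.
function smoothArray(values, windowSize) {
  return values.map((_, i) => {
    let sum = 0;
    let count = 0;
    for (let j = i - windowSize; j <= i + windowSize; j++) {
      if (j >= 0 && j < values.length) { // only positions inside the array
        sum += values[j];
        count++;
      }
    }
    return Math.round((sum / count) * 100) / 100;
  });
}
```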

Me:

that works great! now i'm wondering if there is a way, when producing this window of values and calculating the average, if greater weight could be given to the values at the center of the window

ChatGPT:

Yes, you can give greater weight to the values at the center of the window by using a weighted average approach. One way to achieve this is to assign weights that decrease as you move away from the center of the window. A common approach is to use a Gaussian (normal) distribution for the weights, but for simplicity, we can use a linear or triangular weighting method.

Here's an updated function that uses linear weights, where the center value has the highest weight and the weights decrease linearly towards the edges of the window:

...
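(A sketch of this linearly-weighted version. Note that it divides by the weight of the full nominal window even near the ends of the array, which is what reproduces the edge problem I bring up next.)

```javascript
// Sketch: triangular weighting, with the center of the window getting
// the largest weight and the weights falling off linearly at the edges.
function smoothArray(values, windowSize) {
  return values.map((_, i) => {
    let weightedSum = 0;
    let weightTotal = 0;
    for (let j = -windowSize; j <= windowSize; j++) {
      const weight = windowSize + 1 - Math.abs(j); // e.g. 1,2,3,2,1 for windowSize 2
      weightTotal += weight; // counts the full nominal window, even past the array's ends
      const k = i + j;
      if (k >= 0 && k < values.length) {
        weightedSum += values[k] * weight;
      }
    }
    // dividing by the full-window weight total makes the averages
    // sag near the ends of the data -- the problem raised next
    return Math.round((weightedSum / weightTotal) * 100) / 100;
  });
}
```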

Me:

an unexpected problem: at the ends of the graph, the averages seem to drop off due to lack of data in the window (because the window reaches outside the data). what can we do about that?

ChatGPT:

The drop-off at the ends of the graph is due to the reduced number of data points available for averaging within the window. To address this, you can modify the calculation of the average so that it only considers the actual weights and data points present within the bounds of the array. This means adjusting the weight sum for edge cases, ensuring that the total weights used in the calculation reflect only the valid weights.

Here's an updated version of the function that accounts for this edge case:

...
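(A sketch of the edge-aware version: weights are accumulated only for positions that actually exist in the array, so the denominator shrinks along with the window at the ends.)

```javascript
// Sketch: same triangular weighting, but positions outside the array
// are skipped entirely, so the ends no longer sag.
function smoothArray(values, windowSize) {
  return values.map((_, i) => {
    let weightedSum = 0;
    let weightTotal = 0;
    for (let j = -windowSize; j <= windowSize; j++) {
      const k = i + j;
      if (k < 0 || k >= values.length) continue; // skip positions outside the data
      const weight = windowSize + 1 - Math.abs(j);
      weightedSum += values[k] * weight;
      weightTotal += weight; // only valid weights enter the denominator
    }
    return Math.round((weightedSum / weightTotal) * 100) / 100;
  });
}
```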

Me:

that's great -- but i'm wondering now if we can pass in how large the disproportionality of the weight that we can give to the middle of the data window should be

ChatGPT:

You can modify the function to accept a weightFactor parameter that determines how disproportionately the weight should be distributed toward the center of the window. The weightFactor will allow you to adjust the steepness of the weighting, making the center values more or less influential compared to those at the edges.

Here's how you could implement it:

...
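(And a sketch of the final form, with a weightFactor exponent applied to the triangular weights; a factor of 1 reproduces the previous linear weighting.)

```javascript
// Sketch: weightFactor exaggerates or flattens the triangular weights;
// 1 gives the linear weighting above, larger values favor the center more.
function smoothArray(values, windowSize, weightFactor = 1) {
  return values.map((_, i) => {
    let weightedSum = 0;
    let weightTotal = 0;
    for (let j = -windowSize; j <= windowSize; j++) {
      const k = i + j;
      if (k < 0 || k >= values.length) continue; // skip positions outside the data
      const weight = Math.pow(windowSize + 1 - Math.abs(j), weightFactor);
      weightedSum += values[k] * weight;
      weightTotal += weight;
    }
    return Math.round((weightedSum / weightTotal) * 100) / 100;
  });
}
```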


I didn't end up making use of that last feature, since (at least with my data) values larger than one tended to make it seem as though the window size had shrunk. I found that with my battery percentage data, a window size of 18 was ideal for perfectly smoothing out the jags without obscuring what the data was actually doing.
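(With a hypothetical variable name for the raw integer data, the call I ended up with would have looked something like this, using the sketch above:)

```javascript
// 18 neighbors considered on each side (as windowSize is defined in the
// sketches above); weightFactor left at its default of 1
const smoothedPercentages = smoothArray(batteryPercentages, 18);
```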

I'd taken a recreational 150 mg dose of pseudoephedrine this morning, and, when combined with diphenhydramine (but no alcohol), it compelled me to stay up fairly late.


I took Charlotte for another walk up the Chamomile Headwaters Trail and then back home via the Stick Trail. This is the stone wall south of the Chamomile.


Closer to the west end of that wall.


Charlotte near the Chamomile crossing on the Stick Trail, photographed from near the stone wall.


For linking purposes this article's URL is:
http://asecular.com/blog.php?240731
