The New Year

Now if only I could find out how to spend this Clubcard voucher I could have a belated New Year's out-of-office, my own special company bash. Both previous attempts have ended in "Not this product range", "It's been used" and other fabulous excuses from the gobshites. I did get an "I don't know" and a suggestion to call a "busy" helpline, as they also seem to be fans of draining mobile batteries so you can't use the Clubcard feature in the app when your phone goes flat.

Alcatel Longer Term

So it does do the WiFi network "sign in", but not always (with no error listed), and it took an absurd, "got to do a spy search" amount of time to agree to format the recycled SD card. … Google picture uploads will be charged from 1st June. Coincidence, or ET finger plug?

An Open Standard for Large Event COVID Passports?

The POX Algorithm RFC: how to show an auth token when you have privacy but no booking or other door-duty record. The phone occluded xenomorph algorithm. A complex cypher to protect data at all points in transmission. What really gets shown is an event-specific checksum verify on some encrypted data, which can be further queried by a provider (such as the NHS) to obtain validity and scope for the event's purpose, on a statistical check basis to reduce server traffic load and focus on hot areas.
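
My reading of that paragraph as a minimal sketch: the door sees only an event-specific checksum over still-encrypted data, and escalates a random sample of entries to the provider for a deeper validity check. The HMAC choice, the function names, and the 5% sample rate are my assumptions, not anything fixed by the RFC.

```python
import hmac, hashlib, random

def event_checksum(event_key: bytes, encrypted_record: bytes) -> bytes:
    # event-specific keyed checksum over the still-encrypted record
    return hmac.new(event_key, encrypted_record, hashlib.sha256).digest()

def door_check(event_key: bytes, encrypted_record: bytes, shown: bytes,
               provider_verify, sample_rate: float = 0.05) -> bool:
    # local check: does the shown checksum match this event's key?
    if not hmac.compare_digest(event_checksum(event_key, encrypted_record), shown):
        return False
    # statistical escalation: only a sample of entries hits the provider,
    # keeping server traffic low while still focusing audits on hot areas
    if random.random() < sample_rate:
        return provider_verify(encrypted_record)  # e.g. NHS validity and scope
    return True
```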

At 2953 bytes of data capacity in a QR barcode (23,624 bits), there is enough scope for a double signature and some relevant data held in escrow for falsification auditing. The following data layers are relevant, with keys in between (a byte-budget sketch follows the list).

  • Verify credential entry VCE (the blind of public record customs inquiries)
    • validity decrypt key (event private key part) VDK QR
  • Door event transit DET (the over the shoulder mutable) QR
    • event encrypt key (event public key) EEK QR
  • Phone independent ephemeral PIE (the for me check)
  • A public blockchain signed hashed issue SHI (the public record) QR
    • authority signature keys (the body responsible for a trace of falsifications)
    • hashed phone number key (symmetric cypher)
    • record blind key (when combined with the event private key part makes the effective private key. Kept secret from the event)
    • confidentiality key (database to publication network security layer)
  • Actual data record ADR (the medical facts)

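As a byte-budget sketch of the layers above (the field sizes and the two signature slots are guesses on my part; only the 2953-byte capacity and the double-signature idea come from the text):

```python
from dataclasses import dataclass, fields

QR_CAPACITY_BYTES = 2953  # version 40 QR, binary mode, low error correction

@dataclass
class PoxPayload:
    vce: bytes         # verify credential entry
    det: bytes         # door event transit
    pie: bytes         # phone independent ephemeral
    shi: bytes         # signed hashed issue (public-blockchain anchored)
    adr: bytes         # actual data record (the medical facts, encrypted)
    sig_issuer: bytes  # first of the double signature (assumed split)
    sig_event: bytes   # second of the double signature (assumed split)

    def fits_one_qr(self) -> bool:
        total = sum(len(getattr(self, f.name)) for f in fields(self))
        return total <= QR_CAPACITY_BYTES
```
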
Various keys are required, but it is perhaps clearer to cover the QR codes needed.

  • The manager VDK QR (given to the door manager)
  • The issue SHI QR (given by the provider)
  • The event EEK QR (posted online or outside the event)
  • The entry DET QR (made for the bouncer to scan)

At the point of issue, there may be a required pseudo-event to check that all is working well. The audit provider or issuing provider (such as the NHS) has enough data on a valid VCE to call the user and the event in a conference call. Does the credential holder answer and speak to an echoing bouncer? Does the provider send a text?

A New Paper on Computation and Application

https://www.amazon.co.uk/Pipeline-Cache-Big-RISC-Computational-ebook/dp/B07XY9RSHH/ref=sr_1_1?keywords=pipeline+cache+big+risc&qid=1568807888&sr=8-1 is a nice paper on some computation issues, and eventually covers some politics and vitamin biochemistry. Not a fan? Still letting your biome make you shout at the bad people not feeding your hunger?

Shovel in the gammon all you want, and load it up with chips as a little survivor from ancient times takes advantage of the modern high carb diet and digs a hole for you.

Current Contracts

My current contract is interesting work, spoiled only by the accounting trend of paying everybody as late as possible. It looks as though I'll up my rates to account for and manage invoice factoring. It does no good to act as a credit source to a company which has limited status, and yet either has the resources to pay on time or is being miserly in the period before receivership. Either way, such companies are not my friend, and no good to be associated with. Credit control …

Well a bit delayed, but payments are arriving, and I’m investing in some electronic product design.

Implementation of Digital Audio filters

An interesting experience. The choice between FIR and IIR comes first. As the filtering is modelling classic analogue filters, the shorter-coefficient IIR varieties are the best choice for me. An infinite impulse response is of no concern with a continuous stream of data, and coefficient rounding is not really an issue when using doubles. IIR also has the advantage of an easy Sallen-Key implementation, thanks to the subtraction and re-adding of the feedback component around a very simple CR process.
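
As a minimal sketch of such a short-coefficient IIR section, here is a standard 2-pole low-pass biquad (RBJ cookbook style) run on doubles. This is a generic illustration of the idea, not the Sallen-Key derivation itself:

```python
import math

def lowpass_biquad(fc, fs, q):
    """Direct Form I coefficients for a 2-pole low-pass (RBJ cookbook)."""
    w0 = 2.0 * math.pi * fc / fs
    alpha = math.sin(w0) / (2.0 * q)
    cw = math.cos(w0)
    b0 = (1.0 - cw) / 2.0
    b1 = 1.0 - cw
    b2 = (1.0 - cw) / 2.0
    a0 = 1.0 + alpha
    a1 = -2.0 * cw
    a2 = 1.0 - alpha
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)  # normalised by a0

def process(samples, coeffs):
    # Direct Form I: short loop, two input and two output delay slots
    b0, b1, b2, a1, a2 = coeffs
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

coeffs = lowpass_biquad(fc=20000.0, fs=176400.0, q=0.707)
ys = process([1.0] + [0.0] * 15, coeffs)  # first 16 samples of the impulse response
```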

The most interesting choices are to do with the anti-alias filtering, as the interpolation filter on up-sampling is an easy choice. As the ear is not really responsive to phase, all the effort should go on the pass-band response levels and a good stop-band rejection. Legendre and Butterworth are the candidates. The concept of a characteristic sound enters the design process at this point, as the cascading of SK filter sections is conceptually useful to improve the -6 dB response at cut-off. This is a trade-off between 20 kHz to 22.05 kHz in the alias pass band, and greater attenuation in the desired infinite stop above 22.05 kHz. The slightly greater desire for alias attenuation over pass-band maximal flatness (for audio harmony) implies the Legendre filter is better for the purpose than Butterworth.

In the end, the final choice is one of convenience, and a 9th-order filter was decided upon, with 4 times oversampling. Using 4 times oversampling instead of 8 times brings the alias band down an octave. Under the assumption of at least a linear fall in the amplitude of whatever generates an alias frequency as frequency increases, this just requires a -12 dB extra gain reduction in the alias filter for an effective equivalence to 8 times oversampling (6 dB on the way up plus 6 dB on the reflection back down). The amount of GHz processing also halves. These facts then become constructive in the design, with the bulk alias close to the cut-off, and the minor reflected alias-alias limit not being too relevant to overall alias inharmonic distortion.
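
The arithmetic, spelled out as a back-of-envelope check of that paragraph rather than any new analysis:

```python
# Halving the oversampling ratio from 8x to 4x moves the alias band down one
# octave. At an assumed >= 6 dB/octave roll-off of the alias source, counted
# both on the way up and on the reflection back down, that costs 6 + 6 dB,
# so the alias filter needs ~12 dB more stop-band attenuation to break even.
octaves_lost = 1            # 8x -> 4x oversampling
rolloff_db_per_octave = 6   # assumed linear amplitude fall with frequency
extra_attenuation_db = 2 * octaves_lost * rolloff_db_per_octave
print(extra_attenuation_db)  # 12
```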

A triple chain of 3-pole Legendre filter sections is the decided design. The approximate -9 dB at the corner allows for slightly shifting the cut-off up while still maintaining a very effective stop band. Code reuse also aids I-cache usage for effective CPU use. A single 3-pole Legendre is the interpolation up-sample filter. The roll-off from not using Butterworth does cut some high-frequency content relative to the maximally flat response (hence the name maximally flat), but it outperforms a Bessel filter in this regard. It's not as though a phaser or flanger needs to operate almost perfectly in the alias band.
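
A structural sketch of the triple chain at the 4x rate discussed here (later revised to 8x in the EDIT below). scipy has no Legendre/Papoulis design built in, so a 3-pole Butterworth stands in purely to show the reuse of one small section three times; the 24 kHz corner is my guess at "slightly shifting up the cut off":

```python
# One 3-pole section designed once, applied three times for 9 poles total;
# a small reused routine helps I-cache residency. Butterworth stands in for
# Legendre here (no Papoulis design in scipy).
import numpy as np
from scipy import signal

fs = 4 * 44100  # 4x oversampled rate, per this paragraph
sos = signal.butter(3, 24000.0, btype='low', output='sos', fs=fs)

def nine_pole(x):
    for _ in range(3):  # the same section, three times
        x = signal.sosfilt(sos, x)
    return x

out = nine_pole(np.random.randn(1024))
```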

Perhaps there is improvement to be made in the up-sampling filter, by a post-up-sample 88.2 kHz noise-shaped injection to eliminate all error at 44.1 kHz. This may have the potential advantage of mapping the alias noise into the low frequencies, instead of encroaching from the higher frequencies to the lower, and of creating the alias as a reduction in signal-to-noise ratio instead of at certain inharmonic peaks. The main issue with this is the 44.1 kHz wave fundamental, seen as amplitude ring modulation of the injected phase noise by the 44.1 kHz stepped waveform between input samples. The 88.2 kHz "carrier" and the sidebands are higher in frequency, and of the same amplitude magnitude.

But as this is aiming for no 44.1 kHz error, the 88.2 kHz carrier and sidebands are the induced noise, the magnitude of which is of the order of 1 octave up from the -3 dB roll at the corner, plus approximately the octave for a 3-pole filter, or about a 36 dB cut of a signal 3/4 of the input amplitude. I'd estimate about -37 dB at 88.2 kHz, and -19 dB at 44.1 kHz. Post-processing with a 9-pole filter provides an extra -54 dB on down-sampling, for an estimate of around -73 dB or better on the noise. That would be about 12-bit resolution at 44.1 kHz, increasing with frequency. All estimates, with likely errors, but in general not a good idea from first principles. Given that the 44.1 kHz content would be very small after the interpolation filter, -73 dB down from this would be good, although I don't think it is achievable in a sensible manner.

Using the last filtered input sample as the baseline reference for the present one, the signal at 22.05 kHz would be smoothed. It would have a notch-filter effect, injecting quantization-offset ringing noise at 88.2 kHz to cancel 22.05 kHz. The notch would likely extend down in frequency, for maybe -6 dB at about 11 kHz. Perhaps in the end it is just better to subtract the multiplied difference between two up-sample filters using different sinc spreadings of a 1000 and a 1100 sample occupancy zero inter-fill. Subtracting the alternates' up-conversion delta, as it were.

There is potentially also an argument for a second-order section with a damping factor near 0.68 and a corner at 22.05 kHz, to achieve some normalisation from sinc up-sampling. This adds in just enough Q to peak the filter and cancel the sinc droop, which would be about 3% at 4 times oversampling.
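
A quick check of that ~3% figure, using the zero-order-hold sinc-droop model at the 4x rate (the model choice is my assumption):

```python
# Droop at the band edge from sample replication: droop = sinc(f / fs_os).
import math

fs_os = 4 * 44100                          # 176.4 kHz oversampled rate
f = 22050.0                                # band edge
x = f / fs_os                              # 0.125
droop = math.sin(math.pi * x) / (math.pi * x)
print(1.0 - droop)                         # ~0.0255, i.e. about 3% droop
```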

EDIT: Some of you may have noticed that the required frequencies for stable filtering are too high at 4 times oversampling. So, unfortunately for the CPU load, an 8 times oversample has to be used. The sinc error is less than 1% at this oversample, but it is still corrected in a similar way, with a benefit of 2 extra poles. Following this by an inverted 0.1 dB 3-pole Chebyshev high-pass gives a reasonable 5-pole up-sampling filter. The down-sampling filter, for code efficiency, is a triple instance of the same inverse Chebyshev, with the corner frequencies slightly offset to produce more individual zeros and some spreading of the "ringing". These 9 poles are enough to get the stop-band ripple lower than 16-bit resolution. Odd-order inverse Chebyshev filters are essential for the reflected spectra to be continually decreasing in amplitude.
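
The revised down-sampling filter as a sketch under assumptions: the per-section stop-band depth (40 dB) and the exact offset corners are my guesses, and only the structure — three 3-pole inverse Chebyshev sections, odd order, slightly offset corners — is from the paragraph. Three sections at 40 dB each stack to roughly 120 dB, comfortably below 16-bit ripple as stated.

```python
# Down-sampling filter sketch: three 3-pole inverse Chebyshev (cheby2)
# low-pass sections with slightly offset corners to spread the zeros.
import numpy as np
from scipy import signal

fs = 8 * 44100                           # the EDIT settles on 8x oversampling
corners = [24000.0, 25000.0, 26000.0]    # assumed, slightly offset
sections = [signal.cheby2(3, 40.0, fc, btype='low', output='sos', fs=fs)
            for fc in corners]           # 40 dB stop band each (assumed)

def down_filter(x):
    for sos in sections:                 # 9 poles total
        x = signal.sosfilt(sos, x)
    return x

out = down_filter(np.random.randn(1024))
```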

How Moving to Outlook Destroys Your Gmail

Just testing out Office 365. Surprising features in Outlook include that unsubscribing from folders still leaves them deletable. Which is very surprising, as you'd likely unsubscribe from a folder precisely to avoid consuming local space and download, and just want it left alone on the server. But alas, the default is to spam-fill everything, there is no subscribe wizard, and the default action on unsubscribed folders is bad; you'd think Microsoft never tested this thing in the real world.

I have been thinking of a short-term cost saving from an Azure migration, but this has to be tested before committing. The last thing I need is to find that Microsoft has some limits on root servers. I find my current provider good, and would not want to end up between a rock and a hard place.

Tax in a Technological Economy

I decided to produce this free work covering a subject certain to almost all; as the saying goes, death and taxes. As director of this company I have to hold an opinion on such things, and be sure the government does too. I thought it best to open the internal decision stream to customers, subcontractors and the wider public, so that issues such as "am I buying services from the latest tax haven?" or "is an effective extra 20% going to the cause of central?" are out in the open. So this company will pay all taxes due, and perform no self-manufactured tax breaks by diverting funds into holding companies with surprisingly opaque financials.

As to the issuance of dividends, it's only an issue of the differential between personal income tax and corporation tax. If one is higher or lower, the government's intent is to suggest (depending on whether it is a break or punitive) that one or the other should be done. To comply with this differential, all profit will either bear corporation tax or be issued fully as dividend to the same effective amount, fulfilling the intentional supra directive of governance. As a maximal to government policy can have no detriment beyond the government's option to issue a rebate, a matter of two sums and a max holds the basis for the return. This will be issued into the record once a short-form expression of this intent is formed.
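
As a toy illustration of the "two sums and a max" (the rates here are hypothetical placeholders, and none of this is tax advice):

```python
# Compare the two routes for a year's profit and follow whichever hands the
# government the larger amount, treating that as the policy-intended route.
profit = 10000.0
corp_rate = 0.19       # assumed corporation tax rate
div_effective = 0.26   # assumed combined effective rate on a full dividend

route_corp = profit * corp_rate
route_dividend = profit * div_effective
tax_paid = max(route_corp, route_dividend)
print(tax_paid)        # the maximal-to-policy return
```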

The arguments against this are not relevant for small companies, but are interesting to me. The financial weight of a company in a sense votes on the validity of government policy, and via the central limit theorem can, though does not always, add some stability to the bumps of government policy input at very specific points such as the budget. Whether this share-based meta-vote is valid in the ideal is not the point; it is fact. In a way, I felt the need to set out the situation, as with a very large company an implementation of a maximal extreme policy would lead to potential PLC stock instability, and more critically to divergence and resulting accountability of government policy.

In the days where a director's bread runs thin, the prices really could be justified higher, but in a situation of capital demand, the lower but liveable introductory offer is the sale, sale, sale of it all. The capital buffer of a large company will always outperform in a negative-cost sale battle. The only option in such a situation is quality contract delivery at a reasonable cost. This is innately a consequence of pay-by-the-hour systems. All known good clients know this. Optimization, efficiency and automation are exemplified by the phone in your pocket. Robots, robots, robots. Some estimates place 20% redundancy through automation within a working lifetime as conservative. There is as yet no automated provider that effectively replaces pay-by-the-hour occupations. Robot living allowance is not a joke, although darkly funny.

As a digital business with surprisingly analog books, many computer-based Americanisms and longer-term goals of electrical production, KRT has to be aware of future customers, and not just market for now. Holding risk based on past operational equipment, when the economic model is to design and sell services for the future deficit load and not a present credit bubble, is the game of the future. At present KRT has on occasion used contractors, as employee costs are difficult to justify. Not the work risk, but the contract risk: continuous jobs, even technology-based ones, need a steady supply of work contracts coming in. The larger the contract, the more unstable it is to renegotiation, and the greater the need of a cash buffer. Contractors are much easier. They are in effect on zero-hour contracts, as the lingo goes, although this is more like at least some hours but no guarantee of repeat business. This then leads on to the obvious development of customer service, to achieve a conversion and repeat-business rate. At KRT, the model is quality contract service. This does not involve a cold-call sales pitch, as I receive enough of those already; they are good for occupying the time of those who do not need the service, with an occasional contract win. Delivering services to people who are in need of the service, without harassment for further business, is part of the service. Good customers will always know where to look again, and are good at recommendation. The most effective strategy from the KRT point of view is then the development of R&D rigs and insight concept projects. Interesting technology becomes worth looking in on.

The updates will continue where relevant.

The Inquiry Forms Emailing Now Works

Just set up sendmail on the server using these instructions. Now it's possible to send inquiries. There will also be other forms as and when relevant. The configuration had an extra step of using a cloud file which behaves as a master for sendmail.mc, but that was no big deal. It all took less than ten minutes. A minor complexity now is to forward the response email somewhere useful, to tie up that loose end for later.
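
One way to tie up that loose end, as a sketch assuming a stock sendmail with an /etc/aliases database on the same box; the alias name and target address are hypothetical, and the script needs root:

```python
# Append a forwarding alias for the form's recipient address and rebuild
# the alias database so sendmail picks it up.
import subprocess

ALIAS = "inquiries: someone@example.com\n"  # hypothetical alias and target

with open("/etc/aliases", "a") as f:
    f.write(ALIAS)
subprocess.run(["newaliases"], check=True)  # rebuilds /etc/aliases.db
```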