Pico ZX Spectrum

I’ve started making my own edits of Pico ZX Spectrum as I found it interesting. A bit of preparation work has been done: adding a custom port, adding save-state routines, a new interrupt mode that (finally) makes some logical sense on the “Speccy”, and a counter register in the sound chip for video sync, polled for specific count values derived from the vertical blanking. The timing is too tight for beam following.

There’s still more to do. I’ve done a video mode enhancement (256×128 in 8 colours); perhaps a new ROM, architecture extensions and some extra Z80 instructions are next.

Gradient Optimization

So if a gradient descent hyper-parameter controlling the learning rate is the usual way, how can this possibly be improved? Given that the approximation of future gradient changes is distributed depending on the batch, averaging gives a more stable basis from which to infer an accelerated projection of the future descent.
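
Something like this, as a minimal Python sketch (the gradient function, batches and the extrapolation factor are illustrative assumptions, not a worked-out algorithm):

import numpy as np

def accelerated_step(w, grad_fn, batches, lr=0.01, beta=0.9, project=1.5):
    """Average gradients over batches for stability, then extrapolate
    (an 'accelerated projection') along the smoothed direction."""
    avg = np.zeros_like(w)
    for b in batches:
        # exponential moving average gives the stable basis
        avg = beta * avg + (1.0 - beta) * grad_fn(w, b)
    # project the descent further than the plain averaged step
    return w - lr * project * avg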

The biggest problem to consider is bound oscillation: the accelerated projection magnifies the learning delta to apply such that locality becomes asymptotically non-convergent (a reverse symmetry in summation acceleration, treating the divergent terms as “merging toward” the first-term limit). This would converge as a metaseries in some instances, but not all. It then becomes essential to scale the approximations by inverse power weighting to force convergence for highly entropic, unstable weights. It may also indicate that weight decomposition could be an effective strategy, splitting a neuron into the stable (time-aligned) and unstable (time-inverted) partitions of a signal.

Assuming the unstable partition has a repellor (opposite to an attractor in chaos), modelling could be used to invert the accelerated projection to the repellor. If the accelerated series is approximated by an integral, the unstable inverse acceleration would perhaps be a reversal of the limits of integration? Or a sign reversal of the limits?

In a sense this splits the network into a composition of multiple networks, based on partitions related to the number of critical negative signs (or more precisely the number of things that could carry negative signs). In this case just one sign, for time, behaves like a hyper-parameter convergence property. After decomposition the algorithm can then be specifically optimized per partition.

Pi Pico

So I got a cheap Pi Pico as the postage and packaging were more expensive than adding in a Pi Pico to exceed the minimum free P&P limit. It’s about a 2 MB flash drive with a dual-core processor and some simple GPIO. It looks like it can do about 500k samples/s ADC over 3 channels at 12 bits, so some audio project seems like a good idea to try.

It has some Amiga “copper” style coprocessors for IO too, so making a video raster scan is likely easy. It has enough power to simulate an 8-bit core with a spare CPU core left over for other purposes. At 133 MHz that’s quite an efficient bit of silicon area. A 1980s super-computer with slightly less vector parallelism and bigger (smaller) storage media. Bargain!

Moonshine Elliptical

Moonshine Elliptical represents my latest combination of commentary and findings on the massively impressive tome of knowledge about elliptic curve theory (useful in cryptography and for generally understanding space and time). It covers fields of characteristic two, three and others (the characteristic being how many times the multiplicative identity is summed to equal the additive identity), along with factorization of parts of the world mechanic into finite simple groups (the extended concept of primes, of which the primes are just one sequence).

1729

Boltzmann-Fermi-Dirac Colour Charges

It’s a long shot, but imagine if you will a gluon made of two halves. Each half can be drawn from the two “weights” (low and high of a non-zero-sum field, if they broke symmetry) and the two “charges” (as from the zero-sum field).

Given cancellation and combination, 8 gluons happen.

L+L+, L-L-, L+H-, L-H+, L+H+, L-H-, H+H+, H-H-.

So there’s more green weight “sticky” and the Boltzmann distribution for the half Bose-Einstein as a Fermi-Dirac perhaps. The blue colour perhaps travels less far due to higher “mass” (if it splits), but as the energy input in the strong force makes more gluons at a critical threshold, the further interaction has more energy and a less gluey implementation in blue.

I wonder if the QCD simulation evaluations can take this all into account for better accuracy. I put the two yellow “charges” in there, which technically would be massed green, but the charge +/- cancellation, without perhaps LH equality, would suggest a kind of neutral weight dipole.

EDIT: As the energy increases, moving into “small-x QCD”, colour coverage expands; to prevent saturation by a UV gluon density catastrophe, the critical temperature is exceeded, the “Cooper pair” effect on the half bosons is removed, and Pauli colour saturation limits the gluon density within the nucleon. Yes, this paragraph is unproven, but there must be some effect stabilising the UV catastrophe. This would also lead to a cyclic order of colour based on mass expression above the critical temperature for some critically small x.

Minecraft Modding

And so I downloaded Forge for 1.18.1 BETA. I’ve got the basic details down, and hope to convert some tutorial module to 1.18.1 and then set about making some extras to try out. ExactFeather396 is the github repo, and it will develop slowly. I seem to have a Microsoft username.

It seems quite a lot is JSON these days, and sometimes the names of methods change a bit. Hopefully this is not too weird to convert and make into something. Some AI mob redstone thing? Who knows?

EDIT 2022-03-01: Ah, so that’s how potions are made! And the RegistryMap class might come in useful later. I’ll have to make some of the classes final and private or default some of the public classes.

Just analyzing the base code at the moment to make it adaptive and have minimal technical debt. I might abstract off some of the names.

Seems the rendering of Mobs is somewhat complex, so I’ll have to have a look-see. A basic Zombie clone reskinned seems easiest. I’ve started on an AI exception mechanism. It starts with BaseCodeException, which fires the instinct emote() when it reaches the base code, and proxies actionTry() and actionCatch(BaseCodeException) for consequential instinct.

Ah, exceptions: encapsulation within a RuntimeException to avoid the dreaded “can’t override method with throws” extension problem, while keeping type checking and catching.

EDIT 2022-03-09: So it has been simplified a bit and made more complex. I’m making it so that various Loaded classes (via extension) are invoked through a static method; passing an instance of self to it then allows the invoke-dynamic override while avoiding lots of casts to super classes. This may look more complex, but it does allow easier pull request merges by keeping things in separate files down to the mini-module level.

Added in a simple potion system I’ve yet to test.

VCV Rack V2

VCV Rack version 2 was released in the new year, and the SDK has also been released. I’m making KRTPluginA available for it (as update 2.25.27), as it is a simple edit update, and writing some better documentation. I’m in the process of doing the simple API 2 migration; the text to write, though, is more of a complex decision.

The phrase “one line description” bumps into a review platform wanting a better and perhaps longer description, with “\n” literal line splitting (as there is no tooltip wrapping). Basically it’s just adding in tooltips, a slightly better description and riddance of some spelling mistakes.

So I’m thinking of what I can do for “Z”, as I’m going to call it. Something related to some interesting maths, so it gives me a reason to read a bit more.

Proper Crimbo

Hi, and greetings from a warm Ikea after leaving the slowly sinking wet, cold boat. Let’s all hope for the “frost” not happening this year with wet feet marine one being a bit shit.

So I got a new battery with a phone around it. The old one had reached the end of service with cold failure imminent. So I got an Alcatel 1 (the cheap dual SIM unlocked one), and it is quite nice for the price, and the battery lasts as it should. I’d recommend it.

It fits a surprising amount in the 8GB, due to Android Go and little bloat. You must check if you already have Go versions installed of some of the old bloatier apps (like Maps). Bits of it are to the cheap end of the market standard, but functional. It is available in other versions with a better camera and more memory, but I don’t need those. A good calculator, YouTube and various social app lite editions. It still has enough space to keep Chrome, and install Libby for audiobook reading.

I suppose I’ll find out how the settings have been reorganized, and how to enable debug mode. One of the 7 taps on the build number in the system settings. OK.

James Webb

They put a big telescope into space. Perhaps they’ll find some intelligent life. Or maybe the intelligent life will broadcast the bunny ear fingers of the thing galaxy wide for a bit of Crimbo fun.

Ltd. Still and QMK.

Yes, the QMK active branch and some news that my accounts are now filed. Zero in/zero out as a boring COVID and low contact availability year. So Ltd. status continues as far as I’m aware. I’ll keep you all informed.

So now the send_unicode_string() function is used for a macro system within the latest keymap coding. This is opposed to how the macro layer emits function key combinations, which is more in line with a tool on the computer handling it. I’ve also added repeated-substring compression (“\\0” to “\\9”).

So it’s more of a hard-baked solution, but it does allow more complex multi-character glyphs to be produced instead of just one Unicode code point. With about 1100 bytes free, that’s about thirty UTF-8 bytes per key action (over the 37 defined key action macros). Even “\\\\” is defined for emitting an initial backslash, just so backslash can be used as a prefix for more complex macro processing than just print-until-end-of-string.

Other macro features, like tapping key codes for nested macros? Yes: if the keycode is added to the array, “\\A” will tap the first keycode.
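
As an illustration only (the firmware itself is C, and the real tables live in the keymap), here is a Python sketch of how the escape handling could behave; the substring and keycode tables below are made up:

# Hypothetical illustration of the macro escape handling (not QMK code).
SUBSTRINGS = ["the", "tion", "ing"]        # stands in for the "\0".."\9" table
KEYCODES = ["KC_ENTER", "KC_TAB"]          # stands in for the "\A".. keycode array

def expand(macro, send, tap):
    i = 0
    while i < len(macro):
        c = macro[i]
        if c == "\\" and i + 1 < len(macro):
            nxt = macro[i + 1]
            if nxt == "\\":
                send("\\")                  # literal backslash
            elif nxt.isdigit():
                send(SUBSTRINGS[int(nxt)])  # repeated-substring compression
            else:
                tap(KEYCODES[ord(nxt) - ord("A")])  # "\A" taps the first keycode
            i += 2
        else:
            send(c)                         # plain character / UTF-8 glyph
            i += 1

expand("cod\\2 \\0 \\A", print, print)      # -> c o d ing, the, tap KC_ENTER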

QMK Keyboard Again

The latest 2021-11-22 commit goes for a 10-layer design with the language BQN built in and three further planes of Unicode glyphs. This leaves 2296 bytes of firmware left for further adaptations. Seems the lock key option consumes quite a bit and I don’t need it.

Altering the U’ ‘ defines such that a Unicode glyph is copied between the single quotes would add a Unicode character to the design.

The control iconographs for example:

[IAT] = U'⚠', [IA] = U'⟁', [IB] = U'🗚', [IC] = U'🗐',
[ID] = U'🔖', [IE] = U'🔎', [IF] = U'👍', [IG] = U'🔔',
[IH] = U'⌫', [II] = U'⭾', [IJ] = U'⏎', [IK] = U'⭿',
[IL] = U'📇', [IM] = U'✓', [IN] = U'🗋', [IO] = U'🗁',
[IP] = U'🐧', [IQ] = U'📤', [IR] = U'📥', [IS] = U'💾',
[IT] = U'🌱', [IU] = U'👎', [IV] = U'📋', [IW] = U'🔑',
[IX] = U'🗙', [IY] = U'🗜', [IZ] = U'⎌', [ILBR] = U'⎋',
[IBSL] = U'🌍', [IRBR] = U'☣', [ICAR] = U'⚗', [IUND] = U'☢',
 
These are picked to represent general principles of the control characters in a modern computer environment. Some of them may be difficult to understand at first, but for example the last row could be considered ECO/BIO/CHEM/PHYS, as an atomic building.
 
Deciding on the extra control layer glyphs (they don’t have ASCII slots but are possible to type) is a bit more complicated. I’ll give them a bit more thought.
 
There’s an interesting VSCode crash 139 (SEGMENTATION FAULT) development which just occurred; it shouldn’t be happening, but it is a crash in the rendering process. Obviously some “bad code” in VSCode?
 
I’ve improved the shift mechanism for some of the extended layers, and filled in the ANSI control code layer to my satisfaction. I’ve finalized the Navigation and Macro layers and added a number lock on the Magenta shift of the Macro Yellow layer.

K Ring Technologies Unlimited and MOND Galactic Equivalence

Due to the fascinating non-working Companies House email notification service for upcoming filings (which was needed but not working in the COVID period), I have missed the accounting date and K Ring Technologies Ltd. is to be struck off the companies register. All business is therefore to become sole trader until such time as sense resumes.

But apart from that, I’ve extended the quantum uncertainty in a gravitational field idea and come up with

1/(r − r²·dr)² − 1/(r + r²·dr)²

as a dipole expansive explanation for dark matter and, through the singularity of the first term, an eventual repulsive dark-energy kind of force. Expanding to first order in dr, the difference is approximately 4·dr/r, so in the dipole limit in a galaxy, for example, the force will approximate 1/r and is effectively a MOND on the small galactic scale.

Parse Buffer Overflows? Dark Priorities.

Sounds like such fun. An irremovable or a point update fix on the press? https://github.com/jackokring/majar/blob/master/src/uk/co/kring/kodek/Generator.java sounds like fun too. Choices, choices? Amplified radial uncertainty of Δr·GMm·Δt ≤ ℏ·r²/2 was kind of the order of last night. Is it dark matter? Is tangential uncertainty in the same respect part of dark energy? The radial uncertainty in a sure instant of time, and the potential gravitational energy? A net inward force congruent with dark energy?

And a tangential version of the squared hypotenuse of radius and tangential uncertainty of radius resultant? That leads to a reduction of gravity at a large radius and is more like dark energy. More evidence for a spectrum of uncertainty amount hence the “less than equals” being simplistic on an actuality?

Oh, no I’ll have to investigate the last GET/POST before errors … how boring (last time an Indian) … guess who?

The Small Big G and Why Gravity?

As G, the gravitational constant, is small compared to other force constants, this makes Δr bigger in gravity for the same amplified ħ uncertainty. With the time accuracy of light arrival in the visible range, the radial uncertainty at a high radial distance integrates over the non-linearity of the 1/r² force, for a net inward pull. Tangentially, the integral would net a reduction in gravity.

Δr·GMm·Δt ≤ ℏ·r²/2

So a partial reason for dark matter and dark energy to be explained by quantum gravity. It’s a simple formula: substitute Δp = mΔv = FΔt using F = ma = GMm/r² into ΔxΔp ≤ ℏ/2 (with Δx = Δr). The answer is approximate; an r ± Δr might be more appropriate for exacting calculations, and r² + Δr² as a tangential hypotenuse.

As the Coulomb force (https://en.wikipedia.org/wiki/Coulomb%27s_law) is 20 orders of magnitude higher, the dark Coulomb equivalent would appear at a radius 10 orders of magnitude larger for the same effect.

As the Mass by the Cube, and the Uncertainty by the Square.

As the distance to the centre of a gravitational lens increases, the radial uncertainty of the mass becomes significant, effectively reducing the minimal acceleration due to gravity and growing the bulk volume integral of mass under uncertainty. The force delta would be inverse cubic, countered by the cubic growth in integration volume. The force would therefore, in isotropy, become a fixed-quantity effect.

This is not even considering the potential existence of a heavy graviton, or the concept of conservation of a mass information velocity that would have a dark energy effect. It still seems “conservation of acceleration” is not even a taught effect considering there are many wine glasses that would have loved to know about it.

As for the rapid running-constant increase toward the unification energy, and what inner sun horizons would do to a G magnification? Likely not that relevant? Only the EM force seems to increase in coupling as the energy of the system dilates in time. This would imply the other three standard forces decrease, necessitating an increase in radial uncertainty on average. The strong force has a with-distance effect below the confinement distance, and so as the radius reduces, a Δr·k·Δt ≤ ℏ/2r rule is likely, which would lead to the most likely reciprocal isomorphism of dark matter and dark energy.

Due to quark mass differences, with k therefore being one of 15 = 6·(6−1)/2 constants depending on the quark pair, a triad product pentad structuring of force to acceleration might occur, with further splitting by boson interactions with quarks. Maybe this is a long shot to infer finality on the low-energy quark set of 6. Likely a totient in there for an 8. That’s all in the phi line and golden, silver and forcing theorems. I wonder if forcing theorems have unforcing and further forcing propergatives?

≤?

You could be right.  So? It’s not as though it affected any of the local accelerators I don’t have. If it’s all about the bit not understood, then as a product constraint, it is where the action is at. As the maths might work, I am speculating the further equations will be in a less than form and so need fewer corrections? Premature optimization is the root? Any tiny effect would be on that side of equality perhaps. Maybe it was just a tilt on the suggestion of an inverse isomorphism. I couldn’t say, but that’s how it exited my mind.

K Ring CODEC Existential Proof

When p = 2q, L(0) is not equal to L(1).

Find n such that (L(0)/L(1))^(2n+1) defines the number of bias elements for a certain bias exceeding 2:1. This is not the minimal number of bias elements but is a faster computation of a sufficient existential cardinal order. In fact, it’s erroneous. A more useful equation is

E=Sum[(1-p)*(1-q)*(2n-1)*(p^(n-1))*q^(n-1)+((1-p)^2)*2n*(q^n)*p^(n-1),n,1,infinity]
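
A quick numeric check of the series as written, truncating when the later terms are negligible (the p and q values below are only for illustration):

def E(p, q, terms=10000):
    """Sum the bias-element series term by term (as written above)."""
    total = 0.0
    for n in range(1, terms + 1):
        total += ((1 - p) * (1 - q) * (2 * n - 1) * p ** (n - 1) * q ** (n - 1)
                  + (1 - p) ** 2 * 2 * n * q ** n * p ** (n - 1))
    return total

# a p = 2q example, and the swapped pair, showing the asymmetry on pq
print(E(0.5, 0.25), E(0.25, 0.5))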

This shows an asymmetry on pq for even counts of containment between adding entropic pseudo-randomness. So if the direction is PQ-biased detection and subsample control via horizontal and vertical position splitting? The bit quantity of clockwise parity XOR reflection count parity (CWRP) has an interesting binary sequence. Flipping the clockwise parity and the 12/6 o’clock location inverts the state for modulation.

So asymmetric baryogenesis, that process of some bias in antimatter and matter with an apparently identical mirror symmetry with each other. There must be an existential mechanism and in this mechanism a way of digitizing the process and finding the equivalents to matter and antimatter. Some way of utilizing a probabilistic asymmetry along with a time application to the statistic so that apparent opposites can be made to present a difference on some time presence count.

Proof of Topological Work

A cryptocoin mining strategy designed to reduce power consumption. The work is divided into tiny bits of work with bits of stall caused by data access congestion. The extensive nature of solutions and the variance of solution time reduce conflict, as opposed to a single hash function solve. As joining a fork increases splitting of share, the focusing of the tree spread into a chain has to be considered. As pull request ordering tokens can expire until a pull request is logged with a solution, pull request tokens have to be requested at intervals and also after expiry, while any solution needs a valid pull request token included in the pull request, such that the first solution on a time interval can invalidate later pull requests solving the same interval.

The pull request token contains an algorithmic random and the head random based on the solution of a previous time interval, which must be used to perform the work burst. It therefore becomes stupid to issue pull request tokens for a future time interval, as the head of the master branch has not been fixed and so the pull request token would not, by a large order, be checksum valid.
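
A rough Python sketch of the token rule described above; the field names, interval length and checksum scheme are all illustrative assumptions:

import hashlib, time

INTERVAL = 60  # seconds per work interval (illustrative)

def issue_token(head_random, algorithmic_random, interval):
    """Master issues a token bound to the already-fixed head and an interval."""
    body = f"{head_random}:{algorithmic_random}:{interval}"
    checksum = hashlib.sha256(body.encode()).hexdigest()
    return {"body": body, "interval": interval, "checksum": checksum}

def token_valid(token, head_random, now=None):
    """A token is checksum-valid only for the published head, and expires
    once its interval has passed (the first solution on an interval wins)."""
    now = now or time.time()
    if int(now // INTERVAL) > token["interval"]:
        return False                    # expired: a new token must be requested
    expect = hashlib.sha256(token["body"].encode()).hexdigest()
    return token["checksum"] == expect and token["body"].startswith(f"{head_random}:")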

The master head address becomes the congestion point. The address is therefore published via a torrent-like mechanism with a clone performed by all slaves who wish to become the elected master. The slaves also have a duty to check the master for errors. This then involves pull-request submissions to the block-tree (as git is) on various forks from the slave pool.

This meta-algorithm therefore can limit work done per IP address by making the submission IP be part of the work specification. Some may like to call it proof of bureaucracy.

The Cryptoclock

As running a split network on a faster clock seems the most effective hack, the master must set the clock by signed publication. On a clock split, the closest modulo-hashed time plus block slave salt wins. The slave throne line is on the closest modulo-hashed values for salt with signed publication. This ensures a corrupt master must keep all slave salts (or references) in the published blocks. A network join must demote the split via a clock moderation factor. This ensures that culling a small subnet to run at a higher rate, to disadvantage the small subnet, is punished on the net reunion by the majority of neutrals on the throne line in the master elective, via the punitive clock rate deviation from the majority. As you could split and run lower in an attempt to punify!

Estimated 50 pounds sterling 2021-3-30 in bitcoin for the company work done 😀

The Rebase Compaction Bounty (Bonus)

Designed to be a complex task, a bounty is set to compress the blockchain structure to a rebased, smaller data equivalent. This is done by effectively removing many earlier blocks and placing a special block of archival index terminals for non-transferred holdings in the ancient block history. This is bound to happen infrequently, or never, and is set at a lotto rate depending on the mined percentages. This would eventually cause a work spurt based on the expected gain. The rule controlling energy expenditure versus archival cost could be integrated with wallet stagnation (into the void) by setting a wallet timeout of the order of many years.

A form of lotto inheritance for the collective data duplication cost of historic irrelevance. A super computation only to be taken on by the supercomputer of the age. A method, therefore, of computational research as it were: not something for everybody to do, but easy for everybody to check as they compact.

Nitro Bacon COVID Hypothesis

So it seems there is a larger fraction of ethnic minority deaths in the COVID actuarial figures in the UK, and it does not seem to be genetic. This leaves environmental causation. I posit that bacon and other nitro-cured salted meat products are eaten in larger amounts by the sections of the populace recording a lower than average actuarial death rate.

The proposed mechanism of action for this effect is through lifting blood pressure by consumption of nitro curing salts and so effecting a partial closure of the ACE2 receptor such that the infection affinity of the covid spike protein is reduced.

Dietary intakes of at-risk population sectors include a reduced-sodium and processed meat intake as a medical diet and may indicate that further research is required on gathering salted preserve intake versus ICU outcome.

I am quite surprised that many apparently random statistics are not captured on the off chance that significance may be shown. It is hardly a problem for the central limit theorem to be applied when the actuary exceeds 100000.

A likely non-chatty and a few more dead seems a better telly for the masses or not? It ain’t even cosmically possible to solve language puzzles these days. A word starting with N and ending in G can lead to a 24-hour Facebook ban. It could best be expressed by saying Obama didn’t have tits. I appealed, but luckily or not due to covid the bums are not on seats at this time for such gatekeeping. Maybe they all ironically died due to lack of nitro salts? I wonder if the pearly gates they may or may not love have a shoot-to-hell policy?

Still a few hours before I can create a Facebook group “Borg Unimatrix Thought Distribution Node” for maximal profit. Borg is likely offensive to the Borg as maximal entropy of algorithmic production would likely be higher on the list if elimination of the surplus to requirement individuals was not placed so high.

Medi-ochre and society as corruption lowers society, the leaders can’t help but choose from lesser options and become the pictures of their own making.

Xenozootic Virology

The limited but perhaps influential evidence that covid might have started as an unnoticeable viral cross-infection into humans (the Italian smoker study and some Chinese ideals) may be responsible for the 1/3rd no-symptom transmission, as it might be possible the Wuhan strain was just a mutation of the unnoticed base virus which became even more infectious and had a greater severity.

This knowledge might indicate an occluded outbreak which being of low infectivity and unnoticeable severity might have already travelled enough of the world to infect about 30% of the world’s population so providing some kind of cross-immunity with the Wuhan strain. For all that I know it could have started with some guy called Keith in Hackney Downs.

A backtrace on the per cent of nonsymptomatic in area density across the globe may have indicative potential on the origination of the Pangolin Mary coming into contact with the occluded strain. Although factoring in the kissy romance of the Italian greeting would have to be used to normalize the neutral expectation of transmission under occlusion along with other societal locale idioms. Such things would potentially affect the nonsymptomatic occluded rate compared to the covid hospitalization rate, and hence be estimable to some extent.

The study of R0 unbiased via lockdown percolation along with critical actuarial induction of lockdown would lead to likely numbers on the binding affinity … blah, redacted. **** ****** ** …

I mean like 30% might be one of those 30/70 behaviourisms via some genetic activation, providing pre-MHC preferential or J section locations of activity.

Free Form Thoughts

A Classic Movie Voice Over

And so did the cutter of stone from the sky release the priest of his knowledge of lack of contact such that a stone cold comparison could be seen, and such that it meant that he still would still not know a hug.

And it became decided that the balance between overtaking the lessers versus timed up greaters as an order for the taking sensing a taked in the mistook, all because analytic in speed of absorption, such that little to as much was done.

How to tell the apprentice from beyond thu execution and what of the touchy humours?

And as the unity lowered with the cut words “different cutter” as they appeared. From this a division of opinion ended in more than a happen-seat. And so it was and might is a mighty word.

The multi-cutural (noel) was seen perhaps ower to the hives of man and fortuatous gods or sub-gods. Then what could be done? Why would they prey upon an idol god for it was upon the nature of being that action did perform some or a difference upon tribes and detribulates. If the payment is freedom then what is it to be holden to a duty?

Bode, bode and thrice bode that minus one is a bitch. Obvious dick in womb joke and all. All bar one off course. Yes, an extra-oneous F. Rise again dear cheapo.

And as he placed ring finger of his fishy right hand upon the pre-chopped and processed tree stump, declaring “take it and fuck off”, all was a bit more cagey and costing of those that never get told of the prices of alternate labour avoidance for profit.

Nice story so far dear observer. I think you’d like a little titillation for your money now. Bring forth babe percents and vital statistics.

What a placement of mind in such a being of knowledge. What could become? What it for removals of of thing never cast, never worried, never done.

In the be ginning. A shrrod ploy to an ends. As all became seated and thrust needed no explanation.

All the Too Messy for Sci-Fi Complaints

Assuming GPT-3 is really good at story completion how can anyone say that errors in word sequencing are irrelevant for the provocation phrase issued to an AI when the purpose is completion from the source through sense and not the generation of a more precise bore?

Although the mathematics of a form of complexity may be essential, the actual origin of the mathematics might not be as essential as a way of introducing the definitive emergents as one would assume. Multiple originations of emergence isomorphism in the completeness of behaviour might and likely are possible.

The latest AI joke is about the Silly can’ts versus the car bonned. Oh, dear. 

Gradients and Descents

Consider a backpropagation pass which has just been applied to a network under learning. It is obvious that various weights changed by various amounts. If a weight changes little, it can be considered good. If a weight changes a lot, it can be considered an essential definer weight. Consider the maximal definer weight (the one with the greatest change) and change it a further per cent in its defined direction. Feed the network forward and backpropagate again. Many of the good weights will go back closer to where they were before the definer pass and can be considered excellent. Others will deviate further and be considered ok.

The signed tally of definer(3)/excellent(0)/good(1)/ok(2) can be placed as a variable of programming in each neuron. The per cent weight to apply to a definer, or more explicitly the definer history deviation product as a weight-to-per-cent for the definer’s direction, makes a training map which is not necessary for using the net after training is finished. It does, however, enable further processing such as “excellent definer” detection. What does it mean?
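
A hedged sketch of how such a tally could be kept per weight (the thresholds and the return-detection rule here are my own illustrative choices):

import numpy as np

DEFINER, EXCELLENT, GOOD, OK = 3, 0, 1, 2   # tally weights as above

def classify(delta1, delta2, small=1e-3):
    """delta1: weight changes from the first backprop pass;
    delta2: changes after nudging the maximal definer a further per cent.
    good = small first change; excellent = good and pulled back toward
    where it was; ok = deviated further; definer = the largest first change."""
    good = np.abs(delta1) < small
    returned = np.sign(delta2) == -np.sign(delta1)
    tally = np.full(delta1.shape, OK)
    tally[good] = GOOD
    tally[good & returned] = EXCELLENT
    tally[np.unravel_index(np.argmax(np.abs(delta1)), delta1.shape)] = DEFINER
    return np.sign(delta1) * tally          # signed tally per weight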

In a continual learning system, it indicates a new rationale requirement for the problem as it has developed an unexpected change to an excellent performing neuron. The tally itself could also be considered an auxiliary output of any neuron, but what would be a suitable backpropagation for it? Why would it even need one? Is it not just another round of input to the network (perhaps not applied to the first layer, but then inputs don’t always have to be so).

Defining the concept of definer epilepsy, where the definer oscillates due to weight gradient magnification, implies the need for the tally to be a signed quantity and also implies that weight normalization to zero should be present. This requires (though it has not been proven to be the only sufficient condition) that per cent growth from zero should be weighted slightly less than per cent reduction toward zero. This can be factored into an asymmetry stability meta.

A net of this form can have memory. The oscillation of definer neurons can represent state information. They can also define the modality of the net knowledge in application readiness while keeping the excellent all-purpose neurons stable. The next step is physical and affine coder estimators.

Limit Sums

The convergence sequence on a weighting can be considered isomorphic to a limit sum series acceleration. The net can be “thrown” into an estimate of an infinity of cycles programming on the examples. Effectiveness can be evaluated, and data estimated on the “window” over the sum as an inner product on weightings with bounds control mechanisms yet TBC. PID control systems indicate in the first estimate that differentials and integrals to reduce error and increase convergence speed are appropriate factors to measure.
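
For concreteness, here is one standard limit-sum accelerator (Aitken’s Δ² extrapolation) applied to three successive values of a weight; whether this is the right “window” mechanism is exactly what is yet TBC:

def aitken(x0, x1, x2):
    """Aitken's delta-squared estimate of the limit of a convergent sequence."""
    denom = (x2 - x1) - (x1 - x0)
    if abs(denom) < 1e-12:
        return x2                 # already (numerically) converged
    return x2 - (x2 - x1) ** 2 / denom

# three successive values of one weight over training cycles (illustrative)
print(aitken(0.50, 0.80, 0.92))   # "throws" the weight toward its limit estimate, 1.0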

Dynamics on the per cent definers so to speak. And it came to pass the adaptivity increased and performance metrics were good but then irrelevant as newer, better, more relevant ones took hold from the duties of the net. Gundup and Ciders incorporated had a little hindsight problem to solve.

Fractal Affine Representation

Going back to 1991 and Michael Barnsley developing a fractal image compression system (the Iterated Systems FIF file format): the process was considered computationally intensive in time for very good compression. Experiments with the FIASCO compression system, an open-source derivative, indicate the best trade-off lies at low quality (about 50%), which is very fast but not exact. If the compressed image is subtracted from the input image and the residual further compressed a number of times, performance is improved dramatically.
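
The residual loop itself is simple; a sketch, with hypothetical compress/decompress functions standing in for the fractal codec:

import numpy as np

def residual_compress(image, compress, decompress, passes=3):
    """Compress, subtract the reconstruction, and recompress the residual."""
    residual, layers = image.astype(float), []
    for _ in range(passes):
        code = compress(residual)              # e.g. a fast low-quality fractal pass
        layers.append(code)
        residual = residual - decompress(code) # what the pass failed to capture
    return layers                              # decode by summing the decoded layers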

Dissociating secondaries and tertiaries from the primary affine set allows disjunct affine sets to be constructed for equivalent compression performance, where even a zip compression can remove further information redundancy. The affine sets can be used as input to a network, and in some sense the net can develop some sort of affine invariance in the processed fractals. The data reduction of the affine compression is also likely to lead to better utilization of the net over a convolutional CNN.

The Four Colour Disjunction Theorem.

Consider an extended ensemble. The first layer could be considered a fully connected distributor layer. The last layer could be considered to unify the output by being fully connected. Intermediate layers can be either fully connected or colour-limited connected, where only neurons of a colour connect to neurons of the same colour in the next layer. This provides disjunction of weights between layers and removes a competition upon the gradient between colours.
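
A sketch of a colour-limited layer as a block-diagonal weight mask (four colours here, though as noted below the count is arbitrary):

import numpy as np

def colour_mask(n_in, n_out, colours=4):
    """Mask that only connects neurons to same-colour neurons in the next layer."""
    mask = np.zeros((n_out, n_in))
    for c in range(colours):
        rows = np.arange(n_out) % colours == c
        cols = np.arange(n_in) % colours == c
        mask[np.ix_(rows, cols)] = 1.0
    return mask

# apply as an elementwise guard on the weights during forward and update passes:
# w_effective = w * colour_mask(n_in, n_out)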

Four is really just a way of seeing the colour partition and does not really have to be four. Is an ensemble of 2 nets of half size better for the same time and space complexity of computation, with a resulting lower accuracy of one colour channel, but in total higher discriminatory performance by the disjunction of the feature detection?

The leaking of cross information can also be reduced if it is considered that feature sets are disjunct. Each feature under low to no detection would not bleed into features under medium to high activation. Is the concept of grouped quench useful?

Query Key Transformer Reduction

From a switching idea in telecommunications, an N×N array can be reduced (mostly functionally, due to sparsity) to an N×L array pair and an L×L array. Any cross-product essentially becomes (from its routing of an in into an out) a set of 3 sequential routings, with the first and last being the compression and expansion multiplex to the smaller switch. Crosstalk grows to some extent, but this “bleed” of attention is a small consideration given the variance spread of having 3 routing weights multiplied up to the one effective weight, and computation is less because L is a smaller number than N.
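
A sketch of the saving, treating the N×N routing as a product of N×L, L×L and L×N factors (sizes illustrative):

import numpy as np

N, L = 512, 64                      # illustrative sizes, L much smaller than N
compress = np.random.randn(L, N)    # multiplex down to the small switch
small = np.random.randn(L, L)       # the L*L core routing
expand = np.random.randn(N, L)      # multiplex back up

route = expand @ small @ compress   # effective N*N routing from 3 factors
print(N * N, N * L + L * L + L * N) # 262144 weights versus 69632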

The Giant Neuron Hypothesis

Considering the output stage of a neuronal model is a level-sliced integrator of sorts, the construction of RNN cells would seem obvious. The hypothesis asks if it is logical to consider the layers previous to an “integration” layer effectively an input stage, where the whole network is a gigantic neuron and integration is performed on various nonlinear functions. Each integration channel can be considered independent, but could also have post layers for further joining of integral terms. The integration time can be considered another input set per integrator functional. To maintain tensor shape, as two inputs per integrator are supplied, the first differential would also be good, especially where feedback can be applied.

This leads to the idea of the silicon connectome. Then as now as it became, integration was the nonlinear of choice in time (a softmax divided by the variable, as goes with [e^x-1]/x; a groovemax if you will). The extra net unineuron integration layer offers the extra time feature of future estimation at an endpoint integral of network-evolved choice. The complexity of backpropagation of the limit sum through fixed constants and differentiable functions, for a zero adjustable layer insert, with scaled estimation of earlier weight adjustment on previous samples in the time series under integration, makes for an ideal propergatable. Wow, that table’s gay as.
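
The [e^x-1]/x “groovemax” is easy to sketch, with its removable singularity at x = 0 handled:

import numpy as np

def groovemax(x):
    """(e^x - 1) / x, with the x -> 0 limit of 1 filled in."""
    x = np.asarray(x, dtype=float)
    safe = np.where(x == 0, 1.0, x)                       # avoid a 0/0 warning
    return np.where(np.abs(x) < 1e-8, 1.0, np.expm1(x) / safe)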

This network idea is not necessarily recursive, and may just be an applied network with a global time delta since last evaluation, for continuation of the processing of time series information. The actual recursive use of networks with GRU and LSTM cells might benefit from this kind of global integration processing, but can GRU and LSTM be improved? Bistable cells say yes, for a kind of registered sequential logic on the combinationals. Considering that a Moore state machine layout might be more reductionist to efficiency, a kind of register layer pair for production and consumption to bracket the net is under consideration.

The producer layer is easily pushed to be differentiable by being a weighted sum junction between the input and the feedback from the consumer layer. The consumer layer is more complex when differentiability is considered. The consumer register really could be replaced by a zeroth differential prediction of the future sample given past samples. This has an interesting property of pseudo presentation of the output of a network as a consumptive of the input. This allows use of the output in the backpropagation as input to modify weights on learning the feedback. The consumer must be passthrough in its input to output, while storage of samples for predictive differential generation is allowed.

So it’s really some kind of propergational Mealy state machine. A MNN if you’d kindly see. State of the art art of the state. Regenerative registration is a thing of the futured.

Post-Modern Terminal CLI

As is usual with all things computing, the easy road of bootstrap before security is just an obvious order of things. It then becomes a secondary goal to become the primary input moderation tool, such that effective tooling brings benefits while not having to rely on the obscurity of knowledge. For example, a nice code-signature no-execution tool where absolutely no code even becomes partially executed if the security situation indicates otherwise.

A transparent solution is a tool for development which can export a standard script to just run within today’s environment. As that environment evolves within the future it can take on the benefits of the tool, so maybe even to the point of the tool being replaced purely by choice of the user shell, and at a deeper level by a runtime replacing the shell interpreter at the system level.

The basic text edit of a script at some primary point in the development just requires a textual representation, a checksum in the compiled code (which is in a different file), and a checksum to allow a text override with some security on detecting a change in the text. This then allows possible benefit from a recompile option along with just a temporary use of the textual version. It won’t look that hard in the end, with some things just having a security rating of “system local” for a passing observer.
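
A sketch of the checksum gate (the paths, hash choice and override policy are illustrative assumptions):

import hashlib, subprocess, sys

def run_script(text_path, checksum_path, allow_override=False):
    """Only run the script if its text still matches the recorded checksum,
    or if a text override is explicitly allowed (and the new sum re-recorded)."""
    text = open(text_path, "rb").read()
    actual = hashlib.sha256(text).hexdigest()
    recorded = open(checksum_path).read().strip()
    if actual != recorded:
        if not allow_override:
            sys.exit("script text changed: refusing to execute")
        open(checksum_path, "w").write(actual)   # accept the edited text
    subprocess.run(["sh", text_path], check=True)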

DCS ASCII Map?

I think I might do a Ham radio licence. I’ve been thinking about it for a few weeks. It might be fun. I’ve been thinking of experimenting with using DCS squelch codes for data transmission of character streams. It should be possible using the 83 codes available with easy mapping.

023@ 114N 205r+ 306lf 4110 503: 703sp
025A 115O 223r- 311′ 4121 506; 712!
026B 116P 226g+ 315( 4132 516< 723″
031C 125Q 243g- 331) 4233 532= 731£
032D 131R 244b+ 343+ 4314 546> 732$
043E 132S 245b- 346, 4325 565? 734%
047F 134T 251up 351- 4456   743^
051G 143U 261dn 364. 4647   754&
054H 152V 263le 365/ 4658    
065I 155W 265ri 371\ 4669    
071J 156X 271dl        
072K 162Y          
073L 165Z          
074M 172*          
  174#          

 

 

This would be easy to integrate into a multipurpose app to connect on digital modes, for a low-bandwidth 300 baud signal at 23 bits per character. This would be quite reliable as a means of doing a more modern RTTY. That just leaves ` _ | and ~ in base ASCII to do later, with 20 (11-9) codes “free”: the 2xx and the 6xx lines. This gives the printable 63, and the 20 control characters with no print, along with a special control for inclusion in printing (dl for delete correction), for 83.

So the 2xx codes (non-destructive locators, except “delete”, the anti-time locator) are colour saturation and direction control with delete (which corrects “time” dynamics, perhaps in a 6-bit code), and the 6xx codes are where more complex things happen. A basis repetition rate for distance starts, and the coding uses this as a basis to transmit on. So a basis of 16 repetitions means each symbol is sent 16 times, for a 1/16 data rate. 612 uses 2^n repetitions based on a log of the number of rp after the symbol to be repeated: 2, 4, 8, 16 … after rp, rprp, rprprp … 662 returns to a maximum basis of repetitions and attempts to reduce it, to keep the number of 627 messages down.

The basis and the use of 612 might lead to a 662 if the decoder is not in synchronization with respect to the basis of repeats. This basis is ignored at the higher-level code and is just a summation over noise to increase S/N by the symbol repetition.
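
A sketch of the receive side of the repetition basis, collapsing each run of repeats by majority vote (the basis value is illustrative):

from collections import Counter

def decode_repeats(symbols, basis=16):
    """Collapse each run of `basis` repeated DCS symbols to one decision
    by majority vote, recovering 1/basis of the raw symbol rate."""
    out = []
    for i in range(0, len(symbols) - basis + 1, basis):
        block = symbols[i:i + basis]
        out.append(Counter(block).most_common(1)[0][0])
    return out

print(decode_repeats(["023"] * 14 + ["114"] * 2, basis=16))   # noise outvoted -> ['023']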

606 sy – synchronous idle 
612 rp – repetition of x[rp]x or x[rp]x[rp]xx (7)
624 ra – rep acknowledge all reps in RX in TX
627 re – rep acknowledge with err correct as 624
631 ri – rep basis increase request (2*)
632 rd – rep basis decrease request (2/)
654 ok – accept basis repetition count by request
662 un – unsync of repetition error reply (max)
664 cq – followed by callsign and sy termination

This allows for a variable data distance at a constant rate, especially if the RX samples the code expectation and averages over the number of symbol reps. It also synchronizes the start of many DCS codes, but would reduce the speed of lock by needing the code aligned.

Extended codes could be used to extend the coding to include other things. This is not necessary, and 83 symbols are enough. This is a good start, and extras are fine though. Even precise datarate coding lock would give better performance over DX at high repetition basis.

A modified form of base64 encoding along with digital signatures (ElGamal?) could provide good binary 8-bit transmission, and good certainty of block reception. A return of the good signature, or the false signature on error, makes for a good block retransmit given a simplex window size of 1. In this case, synchronous idle would be a suitable preamble, and the 2xx and 6xx codes would be ignored as part of the base64-esque stream (except 606 for filling in empty places in the blocks of 5 in the base64 code).
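
A sketch of the window-size-1 retransmit, with a hypothetical sign/verify pair standing in for the actual signature scheme:

def send_blocks(blocks, transmit, receive_ack, sign, verify, retries=5):
    """Stop-and-wait: resend a block until the far end echoes a good signature."""
    for block in blocks:
        sig = sign(block)
        for _ in range(retries):
            transmit(block + sig)
            echoed = receive_ack()            # far end returns the signature it checked
            if echoed == sig and verify(block, sig):
                break                         # good signature: move to the next block
        else:
            raise IOError("block not acknowledged")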

Time Series Prediction

Given any time series of historical data, the prediction of the future values in the sequence is a computational task which can increase in complexity depending on the dimensionality of the data. For simple scalar data a predictive model based on differentials and expected continuation is perhaps the easiest. The order to which the series can be analysed depends quite a lot on numerical precision.

The computational complexity can be limited by using the local past to limit the size of the finite difference triangle, with the highest order assumed zero or a Monte Carlo spread Gaussian. Other predictions based on convolution and correlation could also be considered.
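
A minimal sketch of one-step prediction from the local difference triangle, with the highest computed difference assumed to stay constant (i.e. the next order assumed zero):

def predict_next(window):
    """Build the finite difference triangle over the sliding window and
    extrapolate one step (a Newton forward step: the next value is the
    sum of the last entry of each difference row)."""
    triangle = [list(window)]
    while len(triangle[-1]) > 1:
        row = triangle[-1]
        triangle.append([b - a for a, b in zip(row, row[1:])])
    return sum(row[-1] for row in triangle)

print(predict_next([1, 4, 9, 16]))   # quadratic data -> 25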

When using a local difference triangle, the outgoing sample making way for the new sample in the sliding window can be used to make a simple calculation about the error introduced by “forgetting” that information. This could be used in theory to control the window size, or the Monte Carlo variance. It is a measure related to the Markov model of a memory process, with the integration of high differentials multiple times giving more predictive deviation from that which will happen.

This is obvious when seen in this light. The time sequence has within it an origin from differential equations, although of extreme complexity. This is why spectral convolution correlation works well. Expensive to compute, but it works well. Other methods have a lower compute requirement, and this is why I’m focusing on other methods these past few days.

A modified Gaussian density approach might be promising: assuming an amplitude categorization about a mean, the signal density (of the time series in a DSP sense) can approximate “expected” statistics when mapped from the Gaussian onto the historical amplitude density, given that the motions (differentials) have various rates themselves in order for them to express a density.

The most probable direction holds until the over-probable changes the likely direction or rates again. Ideas form from noticing things. Integration, for example, has a naive accumulation of residual error from how floating point numbers are stored, and higher multiple integrals magnify this effect greatly. It would be better to construct an integral from the local data stream of a time series, and work out the required constant by adding a known integral at a fixed point.

Sacrifice of integral precision for the non-accumulation of residual power error is a desirable trade-off in many time series problems. The inspiration for the integral estimator came from this understanding. The next step in DSP from my creative perspective is a Gaussian compander to normalize high-passed (or regression-subtracted normalized) data to match a variance- and mean-stabilized Gaussian amplitude.
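
A sketch of a rank-based Gaussian compander: each sample is replaced by the unit-normal quantile of its empirical rank (assumes Python 3.8+ for statistics.NormalDist):

from statistics import NormalDist

def gaussian_compand(samples):
    """Replace each sample by the standard-normal quantile of its rank,
    giving a mean- and variance-stabilised Gaussian amplitude."""
    n = len(samples)
    order = sorted(range(n), key=lambda i: samples[i])
    out = [0.0] * n
    for rank, i in enumerate(order):
        out[i] = NormalDist().inv_cdf((rank + 0.5) / n)
    return out

print(gaussian_compand([0.1, 5.0, 0.3, 0.2]))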

Integration as a continued sum of Gaussians would via the central limit theorem go toward a narrower variance, but the offset error and same sign square error (in double integrals, smaller but no average cancellation) lead to things like energy amplification in numerical simulation of energy conservational systems.

Today’s signal processing piece was sparseLaplace, for quickly finding, for some sigma and time, the integral going toward infinity. I wonder how the series of the integrals goes as a summation of increasing sections of the same time step, and how this can be accelerated as a series approximation to the Laplace integral.

The main issue is that it is calculated from the localized data, good and bad. The accuracy depends on the estimates of the differentials, and so on the number of localized terms. It is a more dimensional “filter”, as it has an extra set of variables for the centre and length of the window of samples as well as sigma. A few steps of time should be all that is required to get a series summation estimate. Even the error in the time step approximation to the integral has a pattern, and may be used to make the estimate more accurate.
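
A sketch of the windowed estimate: a trapezoidal sum of f(t)·e^(−σ·t) over a local window, where sigma, the window start and the spacing are the extra “filter” variables:

import math

def sparse_laplace(samples, dt, sigma, t0=0.0):
    """Approximate the integral of f(t)*exp(-sigma*t) dt over the local window,
    trapezoidal rule, window starting at t0 with spacing dt."""
    total = 0.0
    for k in range(len(samples) - 1):
        a = samples[k] * math.exp(-sigma * (t0 + k * dt))
        b = samples[k + 1] * math.exp(-sigma * (t0 + (k + 1) * dt))
        total += 0.5 * (a + b) * dt
    return total

# summing successive windows approaches the full integral toward infinity
print(sparse_laplace([1.0] * 200, dt=0.05, sigma=1.0))   # ~ 1 - e^-10, close to 1.0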

AI and HashMap Turing Machines

Considering a remarkable abstract datatype or two is possible, and perhaps closely models the human sequential thought process, I wonder today what applications this will have when a suitable execution model, ISA and microarchitecture have been defined. The properties of controllable locality of storage and motion, along with read and write, branch on stimulus and other yet-to-be-discovered machine operations, make for a container for a kind of universal Turing machine.

Today is a good day for robot consciousness, although I wonder just how applicable the implementation model is for biological life all the universe over. Here’s a free paper on a condensed few months of abstract thought.

Computative Psychoanalysis

It’s not just about IT, but thrashing through what the mind does, can be made to do, did; it all leverages information and modelling simulation growth for matched or greater ability.

Yes, it could all be made in neural nets, but given the tools available why would you choose to stick with the complexity and lack of density of such a solution? A reasoning accelerator would be cool for my PC. How is this going to come about without much worktop workshop? If it were just the oil market I could affect, and how did it come to pass that I was introduced to the fall of oil, and for what other consequential thought sets and hence productions I could change.

One might call it wonder and design dress in “accidental” wreckless endangerment. For what should be a simple obvious benefit to the world becomes embroiled in competition to the drive for profit for the control of the “others” making of a non happening which upsets vested interests.

Who’d have thought it from this little cul-de-sac of a planetary system. Not exactly galactic mainline. And the winner is not halting for a live mind.

UAE4ALL2 on Android with Amiga Forever

It works better than uae4arm when you don’t have much memory free internally, as both the system and work drives can be on the SD card. It does involve making an extra System.hdf in a desktop tool and performing a copy <from> to <to> all clone after formatting the system disk as something named other than that (e.g. Workbench) so the copy works.

The directory for the Work drive can be copied off the Amiga Forever CD (which you own), and placed in the folder <StorageDevice>/Android/data/atua.anddev.uae4all2/files along with the System.hdf, as the app only allows one of each and boots from one. It also seems to not allow some combinations, and a bare file system on the Work is better than the other way round.

If you get the ROMs too from the CD, and place them in there, you get a purple boot screen; for some reason it needs an app emulation restart to use the disks in my configuration. The mouse is horrible, so a little USB mini keyboard and trackpad combo is essential. You kind of have to have a bit of font imagination until you set the screen mode (which also needs a shutdown and restart).

QtAp Getting Better

So the app is getting better. The “interfaces” for the extensions have been defined for now, and I’m just doing the last functions of UTF import to bring it up to the level of building the first view. The command menu has been roughly defined, and changes based on the view.

Qt so far is quite nice to use. I have found as an experienced C/Java coder, much of the learning curve is not so much finding the right classes, but the assumptions one has to make on the garbage collection and the use of delete. In some cases, it is obvious with some thought (local variable allocation, and automatic destruction after use), while in others not so (using a common QPlainTextDocument in multiple widgets and removing the default ones). Basic assumption says pointer classes have to be manually handled.

https://github.com/jackokring/qtap/blob/master/classfilter.h is a category filter based on an extensible bloom filter. The .c file is in the same directory.

N.B. It’s so funny that some “amazing” hackers can bring down this sub $10 server. Way to go to show off your immense skill. A logline 142.93.167.254 – – [19/Jan/2020:08:38:01 +0000] “POST /ws/v1/cluster/apps/new-application HTTP/1.1” 403 498 “-” “python-re$ … etc. I’m dead sure no such thing exists on this server. And the /wp-login automated port 80 hammering for services not offered.

But enough of the bad news; something along the lines of maximal entropic feature virtualization sounds like something nice (or not). Who knows? What’s involved? Some kind of renormalization on the mapping of k-means for a spread which morphs the feature landscape, to focus or diverge the areas to be interpreted?

QtAp Release v0.1.13

GitHub Pages

It’s not great, but quite a nice experience with undo/redo, and Git integration. I even added the translation engine as part of the release, but have done no actual translations. It’s a better app initially as it includes some features that will consume time to add to the example notepad app.

Also in the background there is quite a bit which has been done and is ready as soon as the app develops, such as the interception of the action bar so that right click can show/hide which sections are visible, and this is saved as part of the restored geometry.

0.1.13 QtAp Releases and Development

Getting the greying out of menus and action buttons depending on state was a nice challenge for learning the signal/slot methods Qt uses. The tray is automatically generated too, depending on the calls to the addMenu function and the setting of flags to indicate state response routing.

I’m likely to even build a JavaScript host in there for the user and add in some extras for it, as this seems the most obvious way of scripting in the browser era. There are also possibilities to build new views in QML and so allow some advanced design work under the hood, while maintaining a hybrid approach to code implementation.

I cheated quite a lot by having a dependency on Git and so SSH. I’m not sure I even need the socket interface as long as I do some proxy code in JS to move data to and from C/C++.

Moving on to adding features to the interface, and a command menu which has selections based on the current active view. This could be done by buttons, but actions are better as they have a better shortcut method and easier automatic accessibility tool interfacing.

The icon set likely needs a bit of a spruce up, and matching with some sensible default. Maybe adding in the cancellation of a bash sequence so that anti-commands can be supplied in a list, and run if it makes sense to reverse. Maybe later, later.

EDIT: Overriding a class when it is attached to a GUI is slightly complex. I found the easiest way (not necessarily the most efficient, as it depends on how the autogenerated setupUi function saves memory when not executed). The super class needs a simple bool stopUi = false, with the extending subclass just passing this second parameter as true and putting an if(!stopUi) execution guard before the super class’s ui->setupUi(this) call in the constructor.

This allows QObject(parent) to be replaced by superClass(parent) to inherit all the “interface”. There may be other ways using polymorphism, but none as easy.

X16 Millfork Progress

So the basic plot and tile map arrangements are made. Next up is the “open and parse” BMP file from where the currently emulated virtual SD Card will be. This will then allow me to get on with some simple graphics, a compressed large map format and joystick motion of a sprite around the map.

I’m now used to the syntax enough to go for a generic file open routine, and to deal with any PETSCII encoding problems. I’ll have to check to see if I can get a list of the directory today. And yes, it seems like device 1 is the device to use. The fact that the cbm_file library is currently buggy is maybe based on the advice in the programmer’s handbook to use 255 as the auxiliary command, but who knows? It does seem like OPEN <X>, 1, <W(1)/R(0)> is the one, but then the close call crashes as the open failed. The read likely returns a fail code but “works” without a device error.

Millfork and the emulator are now upgraded, and I must try to see what the fixes are, as some breaking changes have been made. Still waiting on the roll-out of the VSCode plugin update; I might have to manually install it from source. Still good for helping with the build and test though. It’s fun, although a little frustrating when off-spec features are yet to be done.