QMK Link Time

So having got the new build process working as expected (some Python ~/.local cache storage issue), link time optimization frees up almost 4 kB for future features. On the latest master build this is quite a lot of feature space when considering such a small microcontroller. A nice feature added to the more modern build options.

I should be able to put together a nice Crimbo update with such a gargantuan amount of code memory, and no need to duplicate features the computer should do easily. In some ways the difficulty is making something extra which might get used.

Docs for amperzand are a little experiment in server languages I’m thinking about. Anonymous inner class? Sounds like it might be cool. So there is still about 3 kB left after double-struck letters (for drop capitals perhaps), and some 46 macros to define before I have to get a little more creative in what’s available.

Ltd. Still and QMK.

Yes, the QMK active branch and some news that my accounts are now filed. Zero in/zero out for a boring COVID and low contact availability year. So Ltd. status continues as far as I’m aware. I’ll keep you all informed.

So now the send_unicode_string() function is used for a macro system within the latest keymap coding. This is opposed to how the macro layer emits function key combinations, which is more in line with a tool on the computer handling it. I’ve also added repeated substring compression (“\\0” to “\\9”).

So more of a hard-baked solution, but it does allow more complex multi-character glyphs to be produced instead of just one Unicode code point. With about 1100 bytes free, that’s about thirty UTF-8 bytes per key action (over the 37 defined key action macros). Even “\\\\” is defined for emitting an initial backslash, just so backslash can be used as a prefix for more complex macro processing than just print until end of string.

Other macro features like tapping keycodes for nested macros? Yes, if the keycode is added to the array, so “\\A” will tap the first keycode.
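
A rough sketch (not the actual keymap source) of how such a macro string scheme can be interpreted: plain UTF-8 runs go to send_unicode_string(), “\\0” to “\\9” expand stored substrings, “\\\\” emits a literal backslash and “\\A” onward taps keycodes from a small array. The subs[] and taps[] names and contents here are illustrative assumptions only.

#include QMK_KEYBOARD_H

static const char *const subs[10] = { "the ", "tion" /* ... common substrings, "\0".."\9" */ };
static const uint16_t taps[] = { KC_ENT, KC_TAB /* "\A" taps the first entry, and so on */ };

static void play_macro(const char *m) {
    char buf[32];
    while (*m) {
        uint8_t n = 0;
        while (*m && *m != '\\' && n < sizeof buf - 1) buf[n++] = *m++;   /* plain text run */
        buf[n] = '\0';
        if (n) send_unicode_string(buf);              /* multi-byte UTF-8 glyphs allowed   */
        if (*m == '\\') {
            char c = m[1];
            if (!c) break;                            /* trailing backslash: ignore        */
            m += 2;
            if (c == '\\') send_unicode_string("\\"); /* literal backslash                 */
            else if (c >= '0' && c <= '9') play_macro(subs[c - '0']);   /* substring expand */
            else tap_code16(taps[c - 'A']);           /* nested keycode tap, assumes c >= 'A' */
        }
    }
}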

QMK Keyboard Again

The latest 2021-11-22 commit goes for a 10 layer design with the language BQN built in and three further planes of Unicode glyphs. This leaves 2296 bytes of firmware space for further adaptations. It seems the lock key option consumes quite a bit and I don’t need it.

Altering the U’ ‘ defines such that a Unicode glyph is copied between the single quotes would add a Unicode character to the design.

The control iconographs for example:

[IAT] = U'⚠', [IA] = U'⟁', [IB] = U'🗚', [IC] = U'🗐',
[ID] = U'🔖', [IE] = U'🔎', [IF] = U'👍', [IG] = U'🔔',
[IH] = U'⌫', [II] = U'⭾', [IJ] = U'⏎', [IK] = U'⭿',
[IL] = U'📇', [IM] = U'✓', [IN] = U'🗋', [IO] = U'🗁',
[IP] = U'🐧', [IQ] = U'📤', [IR] = U'📥', [IS] = U'💾',
[IT] = U'🌱', [IU] = U'👎', [IV] = U'📋', [IW] = U'🔑',
[IX] = U'🗙', [IY] = U'🗜', [IZ] = U'⎌', [ILBR] = U'⎋',
[IBSL] = U'🌍', [IRBR] = U'☣', [ICAR] = U'⚗', [IUND] = U'☢',
 
These are picked to represent general principles of the control characters in a modern computer environment. Some of them may be difficult to understand at first, but for example the last row could be considered ECO/BIO/CHEM/PHYS, on an atomic building.
 
Deciding on the extra control layer glyphs is a bit more complicated, as they don’t have ASCII slots but are possible to type. I’ll give them a bit more thought.
 
An interesting VSCode crash 139 (SEGMENTATION FAULT) just occurred, which shouldn’t be happening but is a crash in the rendering process. Obviously some “bad code” in VSCode?
 
I’ve improved the shift mechanism for some of the extended layers, and filled in the ANSI control code layer to my satisfaction. I’ve also finalized the Navigation and Macro layers and added a number lock on the Magenta shift of the Macro Yellow layer.
 
 

K Ring Technologies Unlimited and MOND Galactic Equivalence

Due to the fascinating non-working Companies House email notification service for upcoming filings (not working in the COVID period), I have missed the accounting date and K Ring Technologies Ltd. is to be struck off the companies register. All business is therefore to become sole trader until such time as sense resumes.

But apart from that I’ve extended the quantum uncertainty in a gravitational field idea and come up with

1/(r-r^2.dr)^2-1/(r+r^2.dr)^2

As a dipole expansive explanation for dark matter, and through the singularity of the first term an eventual repulsive dark energy kind of force. In the dipole limit in a galaxy, for example, the force will approximate 1/r and so is effectively a MOND on the small galactic scale.
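
To first order in r·dr (a quick check of the claimed 1/r behaviour, using the symbols of the expression above):

\frac{1}{(r - r^2\,dr)^2} - \frac{1}{(r + r^2\,dr)^2}
  = \frac{1}{r^2}\left[(1 - r\,dr)^{-2} - (1 + r\,dr)^{-2}\right]
  \approx \frac{4\,dr}{r}

so a fixed fractional uncertainty dr gives a force term falling off as 1/r rather than 1/r².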

Looks like the Net needs a reflective NAK uptake NAK shutdown.

And so some arbitrary bollocks began. Take self query and accept delay of self accessing you to provide NAK nothing or shutdown as your proto want DoS attack. End, happy music continues. If bought net percent, then for purpose?

Seems 17/9/2021 has a dislike of digital flux to elsewhere. It’s surprising how a 10% NAK request packet NAK happened before could force upon the uptake of the promotion of the internet versus those which refuse cache and just want DoS attack without in a sense showing provido.

Why do cells recycle approximately 10% of themselves toward an immune system which kills the not-of-body? Why does the engineering of telecoms have a standing 10% use of bandwidth to achieve effective 90% use goals?

It’s not my number, just the IP/ISP stack. It could just be the sound of Musk and the super-titty highway, but not likely?

VCV Rack Again

Now that the VCV Rack virtual modular synthesizer has stabilized I tried it out for developing C++ modules to add into the rack. I haven’t decided on the exact nature of the modules yet (2021-7-14), but it does work better for the developer than the older version. For a start, a full source compile is not required. There still appear to be some issues with control alignment in the auto-generated module GUI, which is easily fixed by adding 28 to the Y coordinate of the control. My advice is to leave the control “circles” visible until the actual controls cover where you intend them placed.

KRT Plugin A is the repository. Module A seems to be a filter with some strange DSP and an input option for a HPF/LPF ring modulation, with a metallic on the corner frequency offsetable from the main Q frequency but with tracking added on independently. Is this the most fun that can be had with 4 poles? An electronic DSP filter joke. So the test basically works, but there was some DC on the non-HPF path; as the HPF is made by inverse of the LPF and DC cancels, there must have been some DC injected somewhere. OK, found the obvious error at last. Filter A is finished.
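
A minimal sketch of the “HPF by inverse LPF” point, not the module’s actual DSP: a one-pole low-pass and a high-pass formed by subtraction. Any DC on the input appears in the low-pass output and cancels exactly in the subtraction, so DC left on the HPF path has to be injected somewhere after this stage.

typedef struct { float z; } OnePole;

static float lpf_step(OnePole *s, float in, float a) {  /* a: cutoff coefficient, 0..1 */
    s->z += a * (in - s->z);
    return s->z;
}

static float hpf_step(OnePole *s, float in, float a) {
    return in - lpf_step(s, in, a);                     /* DC: in_dc - in_dc = 0 */
}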

And now polyphonic with SIMD. This makes it about 7 times more efficient. Now has graphics. Developing module μ for calculus purposes, then onto a few other niceties. Module μ is finished too. Any errors in the calculus should be reported as this module is about a calculated sound, and errors sounding better are for other modules.

On version 1.2.3 already 😀 as it even took this long to define a suitable version numbering system. At the moment controls are virtually CNC’ed on a grid, and panel graphics are manually kerned (as auto(Tm)ating this is perhaps overkill). The v1.3.4 release now includes the fat T with some bad disharmony as well as lovely 4th and 5th sync sounds and stuff.

L;D and R are now in planning, and are completed for the 1.6.9 release (2021-07-27). Maybe some speed optimizations next, and then a more complex module. A nice website is also being made in markdown here, as that’s what is expected by the default .json tags.

The 1.8.13 release (2021-8-2) includes 8 working modules, with the new ones being Ω (a clock distributor with randomness) and V (a VCA triplet). The 1.9.15-rc2 release (2021-08-05) includes F, a morph filter. Hopefully it goes live soon when compiled by VCV. The Y gate sequencer is almost ready for 16 channels of triggers.

15 machines up there now (2021-8-19) including some oversampled ones, and some utilities for helping out with problems you didn’t know you had. One minor fix for the F filter will be in the next release, and some slight improvements, plus another 3 modules. 

 

Recursive Predictive Neural Networks

Given that the output of a neural net can be represented as y, derived from an input x and a feedback operator f(y) which the network can be trained on, and which may include differential and integral operators in the operator f. As f(y) can be considered to be the feedback synchronization point which is clocked to transit the network forward in prediction, f(y) is delayed in y such that f(y(t-1, …, t-n)) is the applied feedback, to stop “epileptic oscillation” of the forward net function.
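
A sketch of the clocked feedback arrangement, with net_forward() and feedback_f() as hypothetical stand-ins for the trained net and the operator f; the point is that the net only ever sees the delayed outputs y(t-1) … y(t-n), never y(t), which is what blocks the instantaneous loop.

#define N_DELAY 8

float feedback_f(const float *y_hist, int n);   /* hypothetical: may apply differential   */
float net_forward(float x, float fb);           /* and integral operators to the history  */

static float y_hist[N_DELAY];                   /* y(t-1) .. y(t-N_DELAY) */

float step(float x_t) {
    float fb = feedback_f(y_hist, N_DELAY);     /* f(y(t-1, ..., t-n))          */
    float y_t = net_forward(x_t, fb);           /* forward transit of the net   */
    for (int i = N_DELAY - 1; i > 0; --i)       /* clock the delay line forward */
        y_hist[i] = y_hist[i - 1];
    y_hist[0] = y_t;
    return y_t;
}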

The network itself can be programmed on the sequence to learn in an open loop gradient descent and the bias of x activation to f(y) remembrance by either weighting or digital percent application gating. The pattern to lock onto for an input can be trained independent of an input, and then offset by application of the triggering input to balance activation of one output versus another. The actual spreading and maximization of the output attractors becoming disjunct from instancing which attractor to present as output from input.

The “old” feedback from the “last” remembered thing introduces some chaos and mal-attractor effect. This can be removed a little by using an expected previous context training pre-sequence. This can then also introduce contextual recall. The “short term memory” being the contextual state of y, so programming the long term sequence prediction memory with context y and stimulus x.

The production of optimal context for stimulus itself becomes a network programming challenge. It represents the concept of changing predictive utility. As the forward transfer of the network produces the output to feedback, the network itself could produce the optimal context from the requirements delivered through part of x deciding the contextual decode mode. A separate net to organize the change of context in bulk would have specialization separation and generation of terms in parallel advantages. In utility though it would only be used to switch contexts, or cross imagine contexts to place the prediction net on a creative sequence.

This could have application when the context is considered a genetic algorithm process for tuning the network to produce some kind of granular attractor synthesis. The process of providing the scoring feedback in synthesis mode controlled by a hardwired concept of misadventure excursion in the prediction. Another network for bad state recognition to complement the entropy generative context granularization network? So the reality predictive network is contextualized, granulated and tested for productive futures. Then a final factorization of synthetic addition requirements of the imagined product can be performed by a final independent network.

Consciousness is within this last network as the self image of adding self as a possibility factor. The production of a threshold of motor action to produce an attempt at achieving the estimated reality granularization (subject to bounds constraints) being the primary motivator.

A Speech Action Co-ordination Domain

If the input x, and the output y with feedback descriptions, current “genetic” gene combinators and more can be serialized as an inter-AI language, the projection of multiple “conscious” entities in the predictive net of reality simulation can engage in factors-for-product optimization as well as other non-zero-sum optimizations. A net to process one internal representation to another with an acknowledge of simultaneous state with confusion feedback. At higher data rates a negative acknowledge protocol can take over with estimations of animism action between confirmation certainty with residual accidental error bounding.

A Survival Function

The selection basis of the context provided to the reality estimation can adapt to return a higher valuation of the survival “situation understanding” function. This in the real sense is the optimization function for selection of purpose. The reality function just attempts to maximize a correct simulation of reality. The context function attempts to maximize use of granular entropy to increase the coverage range of the reality simulation to increase options of consciousness to action. The action threshold function then decides if the likely action chosen is done, and in a way represents a kind of extrovert measure of the AI.

Component Parts

  • Reality simulation (estimation)

  • Reality factorization (situation)

  • Granular imagination (context)

  • Action selection (desire)

  • Input processing (percept)

Using some kind of Fibonacci growth connection in a surface topological toroid? That would be more on hardware interconnect optimization. Of more interest to the feedback in the reality simulator would be the parametrized operators building differential and integral representations from the feedback. Of the three forms of end point integral, all could be represented. The fact that the log kind has complex series to evaluate, and has no necessary complex log representation might be an added difficulty but would “lock” onto such functional time generatives.

Negative time offsets on the end point limit on such integrals when complex processing is applied introduce the idea of the 2*pi synchronous summand based on angle, as this may be a better input-controlled output representation of the complex domain for an N:1 mapping. A Gaussian distribution of error about the coefficient division.

Chaos Measure

The feedback operator f depends on calculation of differential and integral functions based on weighted sums of y at various t and so it could be said that any initializing or changing of the reality simulation to another play back “granule” has some new data placed in the feedback memory. This new data can have a varied impact based on the likelihood estimation of the time samples having an impact on the calculated differential and integral values along with sensitivity to the feedback signal. This implies each memory bit has some measure of bit change (in a genetic algorithm mutation) on the divergence from the reality simulation. This then can be used to infer a focus mask. The use of gene crossing focus weighting or masking then synchronously produces a chaotic deviation from the training reality.

Modulation of the stored memory context would appear on some level equivalent to altering the coefficients of the estimates for differentials and integrals, but as the chaos measure is a deviation control from an exacting physical model of time evolution, it is thought better to keep the operator mathematics at a static precision, and deviate granularity by memory modulation.

For example 1, -9, 36, -84, 126, -126, 84, -36, 9 are the coefficients to predict the future next sample from the previous nine samples based on a zeroth differential estimate. In open loop training the feedback would introduce a delay step, but prediction of the future would in effect cancel this delay so that effectively the f(y) does not have to be calculated and y can be used. The large range would create some oscillation as the context shift registers were filled with data to feed back. This open-loop programming without reference to f allows pre-training without any feedback instability but with a later oscillation about the manifold.
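
A sketch of the predictor those coefficients describe: assume the ninth finite difference of the sequence is zero and solve for the next sample. The values are the alternating binomial coefficients C(9,k), listed here newest sample first.

static const float coef[9] = { 9, -36, 84, -126, 126, -84, 36, -9, 1 };

float predict_next(const float *y) {            /* y[0] is the most recent of 9 samples */
    float p = 0.0f;
    for (int k = 0; k < 9; ++k)
        p += coef[k] * y[k];                    /* a constant sequence predicts itself: */
    return p;                                   /* the coefficients sum to 1            */
}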

Computational stability requirements are improved if the feedback f is amplified by default expectation, as this forces some non-linear mixing of x to reduce the net summand, moving the bode point of the feedback away from the inactive denormalized zero value. It also increases the net feedback applied to keep the reality simulator feed forward gain below one.

All n orders of differential can be cast as future predictions, and all the integral accelerated forms can be represented with future casting into any t with some renormalization possible but not essentially a necessity. In fact a rectangular offset in the y-axis integrates as a ramp addition to a monotonically increasing sum. Can the network learn a root finding algorithm for applied integral time when wired with learnable pass through of a variable integration time? This time offset from the future prediction time (integral offset time) u can be fed into the operator f and passed through as f(y(t(n)), u(t(n))) with some of the prediction y being used as u.

Alias Locking

In any synchronous DSP circuit with non-linear effects the requirement to keep x and f(y) within the frequency range where alias distortion would potentially present as false signal does indicate that the coefficients could be modified to provide an alias filter. But it may be found that a small chaotic dither dithers the aliases further and leads to a wider band spreading about an alias. The detection of a coincidental alias may aid detection of the signal expected. This extra minimal noise could be extracted from the environment by deviations from expectation. An AI task of removing aliases may be considered as something that could be learnt, but also generating an inverse filter to supply the alias spectrum (excluding sub-harmonics of the clock rate).

Consciousness as the Correlated to Self Action

When the self action of the model produces a correlation in the reality simulation it could be said to have observed a correlation to self in the model. The relation to the situation factorization domain then becomes an obvious connection to equation of virtual actionals given the real actional set. This allows futures, and past observational training. The weighting function of physical error cutting a cookie of size survival plus some splurge.

So it seems “pain” or some milder proxy for bad function should increase situation recognition, reduce recent action, increase the accuracy of reality simulation, improve the percept and perhaps change the context toward known safe positives. An autonomic bypass from the percept to counter action is likely also “grown”.

Factors

The situation analysis net is likely better functional with some feedback. The purpose of this feedback is not time evolution estimation like in the reality simulation, but the use of the factorization of the situation in building a system of meta situational analysis which could include self consciousness. Technically the feedback could be nested recursively and be applied as part of the x input of the reality simulation, but that makes for more complex training.

Considering that many factorization domains have a commutativity structure it implies that post convolution might be a good way of splitting the network result into “factors”. This is placing the convolution as the last layer and not the first layer.

Or FFT for that matter, and in some sense, this layer becomes the first layer of the action decision net of desire.

(Diagram of the five component blocks: Percept, Estimation, Situation, Desire and Context.)

And the variational encoder ratio for optimal mixing of the networks?

Section – Technologies

  • Percept – Variational auto-encoder. Maximal representation of externality. Normalization average.
  • Estimation – Time evolution feedback via calculus operators.
  • Context – Produce genetic algorithm modification for estimation feedback.
  • Situation – Variational auto-encoder with post convolution or ideal order factorization of variation and causation tree.
  • Desire – Threshold action sequencer. Classifier with threshold.

The unity of consciousness as that identified with the knowing of multiple action paths in the imagination as capable of altering a future percept and certainty in achievement of a happy context and situation.

This extends on to the idea of emotive functor attractors as the controlled mechanism for genesis of output from the actional desire. This separates desire as an actional devoid of emotion, in complex with a driving emotion set. What has become of the splurge of biological evolute on the smudged cross product? Does it really assist functional understanding of the power efficiency of self action?

The situation analyser in performing a domain factorization, applying a feedback and estimation of a rule and a correlative later situation could in principle assist with modelling from rule followed by implication of rule. The Gödel incompleteness of the inferred logic controlled by “your stupid” and the implicant “fix yourself” as a splurge cull.

The convergence of the multiple series for different integral forms have bounds. These could be considered some sophisticated parallel to attractor convergence in fractals. As they have a possible intersection as well as a pseudo digital behaviour (time analytic of halting problem applied to divergence) they can be used to represent some digital manifold, while maintaining series differentiability. This implies c(y) and f(c(y)) more importantly be fed back to the estimation.

The separation of the percept before the estimation in a real sense is the great filter. Some post situation feedback would help. The log scaling is perhaps also quite important. Considering an exponential half life may be controlled by production of an enzyme to remove the metastable precursor to reduce it, the multiplicative inverse is quite likely (Newton-Raphson approximant) and integration makes for a log scaling possibility. Some feed forward of x provides entropy and some exponentiation or other series decompositions might be useful.

Templigeadicalogical Algebra

Well, where shall I start? All could see the sums were good, of those taught sums a protected right of the conversationally on. A heavy reign introduction to subtraction and hence divisional and rights of subdivision consensus elective protection from decimation.

Command hierarchies of the on for example a battle unit knowing it is one that is on, set to fight for subfeariors of command wants subjunct summative transmissive networks of optimization feedback of induction of sums?

The out wave of a famine “un” the handy past the time of fight show lean on the order of meditative? As the collective induced cook as woman work to collect the connection of hand fights to womb growth multipliers.

This might be fun at the food policy unit 😀

Fight coming up from blame of “monk” to obviate doubt on family protection, and thrust occasional mindsets to perimeters of risk reduction from onslaught foresightful of the time to tummy from mummy. As the di summand moved predecimand, the focus on god cycle before analytic deconstruction by sumandment of temple duty, became moot as the knowledge collapse in chaotic cycles not brought into feedback bode stability as the control hierarchy became argumentative dominant.

Bun fight!!!! Let then ate cake?? I’m ‘avin a go at integrands, might PID control feedback stuff if the boding command hierarchies summand with better cross information flow? Summands like ? The first L of simon. Technically though it is a second order tension of l fighty feisty hg an onset of bode instability so yicked up set delguage from a fWell, via a dissociable epigenetic panic.

Prove me hyper politely wrong on the abuse that extends from the fear of critique. I’m on, some of the nicest people I know are women. There on, but maybe not on on in the wordfield. The uncertainty potential of action in fold downs of understanding? I can say being a man I understand testosterone. The idiomatic fork as well extends from this therefore the competition between communicative and full active fully automatic via lack of information has its inductive effect.

Anti June could swear on summand a sailor! Error analysis in.  Idiomatic jokes are always a shitter. Control yourshelbves bint dat ladies. First orders, second orders power orders, summands and seais so ship? Lndend? Trade …

Accents for the poor? Accination programs? The enlightenment of the orbifold tonces. The dictum freeze from Oxford, an experiment in analytical management by saturnalian net. A distribute of multi-lingo automation?  Distribute, estimate, summand, perform error control minimization. Unlimnate uncertainty of position.

Parse Buffer Overflows? Dark Priorities.

Sounds like such fun. An irremovable or a point update fix on the press? https://github.com/jackokring/majar/blob/master/src/uk/co/kring/kodek/Generator.java sounds like fun too. Choices, choices? Amplified radial uncertainty of Δr·GMm·Δt ≤ ℏ·r²/2 was kind of the order of last night. Is it dark matter? Is tangential uncertainty in the same respect part of dark energy? The radial uncertainty in a sure instant of time, and the potential gravitational energy? A net inward force congruent with dark energy?

And a tangential version of the squared hypotenuse of radius and tangential uncertainty of radius resultant? That leads to a reduction of gravity at a large radius and is more like dark energy. More evidence for a spectrum of uncertainty amount hence the “less than equals” being simplistic on an actuality?

Oh, no I’ll have to investigate the last GET/POST before errors … how boring (last time an Indian) … guess who?

The Small Big G and Why Gravity?

As G the gravitational constant is small compared to other force constants this would make Δr bigger in gravity for the same amplified ħ uncertainty. With the time accuracy of light arrival in the visible range, the radial uncertainty at a high radial distance integrates over the non-linearity of the 1/r^2 force, for a net inward. Tangentially, the integral would net a reduction in gravity.

Δr·GMm·Δt ≤ ℏ·r²/2

So a partial reason for dark matter and dark energy to be explained by quantum gravity. It’s a simple formula: using a = Δv/Δt, substitute Δp = mΔv = maΔt with F = ma = GMm/r² into ΔxΔp ≤ ℏ/2. The answer is approximate; an r ± Δr might be more appropriate for exacting calculations, and r² + Δr² as a tangential hypotenuse.
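
Spelled out in one line, keeping the ≤ form used throughout this post (with Δx taken as the radial Δr):

\Delta x\,\Delta p \le \frac{\hbar}{2}, \qquad
\Delta p = m\,\Delta v = m\,a\,\Delta t = \frac{GMm}{r^{2}}\,\Delta t
\;\Rightarrow\;
\Delta r\,\frac{GMm}{r^{2}}\,\Delta t \le \frac{\hbar}{2}
\;\Rightarrow\;
\Delta r\,GMm\,\Delta t \le \frac{\hbar\,r^{2}}{2}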

As the Coulomb constant (https://en.wikipedia.org/wiki/Coulomb%27s_law) is 20 orders of magnitude higher, the dark Coulomb force will be 10 orders of radius larger for the same effect.

As the Mass by the Cube, and the Uncertainty by the Square.

As the distance increases to the centre of a gravitational lens, the uncertainty of the mass radially becomes significant so effectively reducing the minimal acceleration due to gravity, and growing the volume bulk integral of mass in uncertainty. The force delta would be inverse cubic, countered by the cubic growth in integration volume. The force would therefore in isotropy become a fixed quantity effect.

This is not even considering the potential existence of a heavy graviton, or the concept of conservation of a mass information velocity that would have a dark energy effect. It still seems “conservation of acceleration” is not even a taught effect considering there are many wine glasses that would have loved to know about it.

As for the rapid running constant increase toward the unification energy and what inner sun horizons would do to a G magnification? Likely not that relevant? Only the EM force seems to increase in coupling as the energy of the system dilates in time. This would imply the other three standard forces decrease, so necessitating an increase in radial uncertainty on average. The strong force has a with-distance effect below the confinement distance, and so as the radius reduces, a Δr·k·Δt ≤ ℏ/2r rule is likely, which would lead to the most likely reciprocal isomorphism of dark matter and dark energy.

Due to quark mass differences, and k therefore being one of 15 = 6*(6-1)/2 constants depending on the quark pair, a triad product pentad structuring of force to acceleration might occur, with further splitting by boson interactions with quarks. Maybe this is a long shot to infer the finality on the low energy quark set of 6. Likely a totient in there for an 8. That’s all in the phi line and golden, silver and forcing theorems. I wonder if forcing theorems have unforcing and further forcing prerogatives?

≤?

You could be right.  So? It’s not as though it affected any of the local accelerators I don’t have. If it’s all about the bit not understood, then as a product constraint, it is where the action is at. As the maths might work, I am speculating the further equations will be in a less than form and so need fewer corrections? Premature optimization is the root? Any tiny effect would be on that side of equality perhaps. Maybe it was just a tilt on the suggestion of an inverse isomorphism. I couldn’t say, but that’s how it exited my mind.

Project “majar”

majar is a Java package. It is in development. The current focus is on a shell language. Also, abstraction interfaces for things will arrive, and then implementations.

  • KeyBase – a database based on the idea of a “BulkStream” (a Base) supporting a 5th normal form of relational database where internally everything inherited from Key can be stored. Fields are not stored in records, but each field value becomes a record so an object storing kind of database. Imagine a field query, and then obtain records for all tables and databases.
  • Kodek – the KODEK of K Ring naming fame. I’m sure I’ll get around to a KeyBase Store specialising in a Kodek supporting Class.
  • majar (intentionally lower case) – a scripting language with a bash launch script so that majar becomes the language from “ma.jar”, get it? 
  • Abstracting the script language to run web applications.
  • A Java Servlet extension class for easing some of the pain.
  • A port 287 IP proxy for localhost compression to the publication of “My Public Computer” interface. Also an application server for the browser client.

K Ring CODEC Existential Proof

When p = 2q, L(0) is not equal to L(1).

Find n such that (L(0)/L(1))^(2n+1) defines the number of bias elements for a certain bias exceeding 2:1. This is not the minimal number of bias elements but is a faster computation of a sufficient existential cardinal order. In fact, it’s erroneous. A more useful equation is

E=Sum[(1-p)*(1-q)*(2n-1)*(p^(n-1))*q^(n-1)+((1-p)^2)*2n*(q^n)*p^(n-1),n,1,infinity]

Showing an asymmetry on pq for even counts of containment between adding entropic pseudo-randomness. So if the direction is PQ biased detection and subsample control via horizontals and verticals position splitting? The bit quantity of clockwise parity XOR reflection count parity (CWRP) has an interesting binary sequence. Flipping the clockwise parity and the 12/6 o’clock location inverts the state for modulation.

So asymmetric baryogenesis, that process of some bias in antimatter and matter with an apparently identical mirror symmetry with each other. There must be an existential mechanism and in this mechanism a way of digitizing the process and finding the equivalents to matter and antimatter. Some way of utilizing a probabilistic asymmetry along with a time application to the statistic so that apparent opposites can be made to present a difference on some time presence count.

Proof of Topological Work

A cryptocoin mining strategy designed to reduce power consumption. The work is divided into tiny bits of work with bits of stall caused by data access congestion. The extensive nature of solutions and the variance of solution time reduce conflict as opposed to a single hash function solve. As joining a fork increases the splitting of share, focusing the tree spread into a chain, this has to be considered. As the pull request ordering tokens can expire until a pull request is logged with a solution, pull request tokens have to be requested at intervals and also after expiry, while any solution would need a valid pull request token to be included in the pull request, such that the first solution on a time interval can invalidate later pull requests solving the same interval.

The pull request token contains an algorithmic random and the head random based on the solution of a previous time interval which must be used to perform the work burst. It, therefore, becomes stupid to issue pull request tokens for a future time interval as the head of the master branch has not been fixed and so the pull request token would not by a large order be checksum valid.
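
A purely illustrative layout of such a token; the field names and sizes are assumptions, not a specification.

#include <stdint.h>

struct pull_request_token {
    uint64_t algorithmic_random;    /* issued per request                           */
    uint8_t  head_random[32];       /* derived from the previous interval solution  */
    uint32_t time_interval;         /* the interval this token may solve            */
    uint32_t expiry;                /* tokens expire and must be re-requested       */
    uint8_t  checksum[32];          /* becomes invalid once the master head moves   */
};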

The master head address becomes the congestion point. The address is therefore published via a torrent-like mechanism with a clone performed by all slaves who wish to become the elected master. The slaves also have a duty to check the master for errors. This then involves pull-request submissions to the block-tree (as git is) on various forks from the slave pool.

This meta-algorithm therefore can limit work done per IP address by making the submission IP be part of the work specification. Some may like to call it proof of bureaucracy.

The Cryptoclock

As running a split network on a faster clock seems the most effective hack, the master must set the clock by signed publication. On a clock split the closest modulo hashed time plus block slave salt wins. The slave throne line is on the closest modulo hashed values for salt with signed publication. This ensures a corrupt master must keep all slave salts (or references) in the published blocks. A network join must demote the split via a clock moderation factor. This ensures that culling a small subnet to run at a higher rate to disadvantage the small subnet is punished by the majority of neutrals on the throne line in the master elective on the net reunion, by the punitive clock rate deviation from the majority. As you could split and run lower in an attempt to punify!

Estimated 50 pounds sterling 2021-3-30 in bitcoin for the company work done 😀

The Rebase Compaction Bounty (Bonus)

Designed to be a complex task a bounty is set to compress the blockchain structure to a rebased smaller data equivalent. This is done by effectively removing many earlier blocks and placing a special block of archival index terminals for non-transferred holdings in the ancient block history. This is bound to happen infrequently to never and set at a lotto rate depending on the mined percents. This would eventually cause a work spurt based on the expected gain. The ruling controlling the energy expenditure versus the archival cost could be integrated with the wallet stagnation (into the void) by setting a wallet timeout of the order of many years.

A form of lotto inheritance for the collective data duplication cost of historic irrelevance. A super computation only to be taken on by the supercomputer of the age. A method therefore of computational research as it were, and not something for everybody to do, but easy for everybody to check as they compact.

An Open Standard for Large Event COVID Passports?

The POX Algorithm RFC. How to show an auth token when you have privacy but no booking or other door duty. The phone occluded xenomorph algorithm. A complex cypher to protect data at all points in transmission. What really gets shown is an event-specific checksum verify on some encrypted data which can be further queried by a provider (such as the NHS) to obtain validity and scope for event purpose on a statistical check basis to reduce server traffic load and focus on hot areas.

At 2953 bytes of data capacity in a QR barcode (23624 bits) there is enough scope for a double signature and some relevant data in escrow for falsification auditing. The following data layers are relevant with keys in between.

  • Verify credential entry VCE (the blind of public record customs inquiries)
    • validity decrypt key (event private key part) VDK QR
  • Door event transit DET (the over the shoulder mutable) QR
    • event encrypt key (event public key) EEK QR
  • Phone independent ephemeral PIE (the for me check)
  • A public blockchain signed hashed issue SHI (the public record) QR
    • authority signature keys (the body responsible for a trace of falsifications)
    • hashed phone number key (symmetric cypher)
    • record blind key (when combined with the event private key part makes the effective private key. Kept secret from the event)
    • confidentiality key (database to publication network security layer)
  • Actual data record ADR (the medical facts)

Various keys are required but covering the QR codes needed is perhaps better.

  • The manager VDK QR (given to the door manager)
  • The issue SHI QR (given by the provider)
  • The event EEK QR (posted online or outside the event)
  • The entry DET QR (made for the bouncer to scan)

At the point of issue, there may be a required pseudo-event to check that all is working well. The audit provider or provider (such as the NHS) has enough data on a valid VCE to call the user and the event in a conference call. Does the credential holder answer to speak to an echoing bouncer? Does the provider send a text?

Nitro Bacon COVID Hypothesis

So it seems there is a larger fraction of ethnic dead in the actuary of covid in the UK, and it does not seem to be genetic. This leaves environmental causation. I posit that bacon and other nitro-cure salted meat products are eaten in a larger amount by the sections of the populace recording a lower than average actuarial death rate.

The proposed mechanism of action for this effect is through lifting blood pressure by consumption of nitro curing salts and so effecting a partial closure of the ACE2 receptor such that the infection affinity of the covid spike protein is reduced.

Dietary intakes of at-risk population sectors include a reduced-sodium and processed meat intake as a medical diet and may indicate that further research is required on gathering salted preserve intake versus ICU outcome.

I am quite surprised that many apparently random statistics are not captured on the off chance that significance may be shown. It is hardly a problem for the central limit theorem to be applied when the actuary exceeds 100000.

A likely non-chatty and a few more dead seems a better telly for the masses or not? It ain’t even cosmically possible to solve language puzzles these days. A word starting with N and ending in G can lead to a 24 hour Facebook ban. It could best be expressed by saying Obama didn’t have tits. I appealed, but luckily or not due to covid the bums are not on seats at this time or such gatekeeping. Maybe they all ironically died due to lack of nitro salts? I wonder if the pearly gates they may or may not love have a shoot to hell policy?

Still a few hours before I can create a Facebook group “Borg Unimatrix Thought Distribution Node” for maximal profit. Borg is likely offensive to the Borg as maximal entropy of algorithmic production would likely be higher on the list if elimination of the surplus to requirement individuals was not placed so high.

Medi-ochre and society as corruption lowers society, the leaders can’t help but choose from lesser options and become the pictures of their own making.

Xenozootic Virology

The limited but perhaps influential evidence that covid might have started as an unnoticeable viral cross infection into humans (Italian smoker study and some Chinese ideals) may be responsible for the 1/3rd no-symptom transmission, as it might be possible the Wuhan strain was just a mutation of the unnoticed base virus which became even more infectious and had a greater severity.

This knowledge might indicate an occluded outbreak which being of low infectivity and unnoticeable severity might have already travelled enough of the world to infect about 30% of the world’s population so providing some kind of cross-immunity with the Wuhan strain. For all that I know it could have started with some guy called Keith in Hackney Downs.

A backtrace on the per cent of nonsymptomatic in area density across the globe may have indicative potential on the origination of the Pangolin Mary coming into contact with the occluded strain. Although factoring in the kissy romance of the Italian greeting would have to be used to normalize the neutral expectation of transmission under occlusion along with other societal locale idioms. Such things would potentially affect the nonsymptomatic occluded rate compared to the covid hospitalization rate, and hence be estimable to some extent.

The study of R0 unbiased via lockdown percolation along with critical actuarial induction of lockdown would lead to likely numbers on the binding affinity … blah, redacted. **** ****** ** …

I mean like 30% might be one of those 30/70 behaviourisms via some genetic activation, providing pre-MHC preferential or J section locations of activity.

Free Form Thoughts

A Classic Movie Voice Over

And so did the cutter of stone from the sky release the priest of his knowledge of lack of contact such that a stone cold comparison could be seen, and such that it meant that he still would not know a hug.

And it became decided that the balance between overtaking the lessers versus timed up greaters as an order for the taking sensing a taked in the mistook, all because analytic in speed of absorption, such that little to as much was done.

How to tell the apprentice from beyond the execution and what of the touchy humours?

And as the unity lowered with the cut words “different cutter” as they appeared. From this a division of opinion ended in more than a happen-seat. And so it was and might is a mighty word.

The multi-cultural (noel) was seen perhaps ower to the hives of man and fortuitous gods or sub-gods. Then what could be done? Why would they prey upon an idol god for it was upon the nature of being that action did perform some or a difference upon tribes and detribulates. If the payment is freedom then what is it to be holden to a duty?

Bode, bode and thrice bode that minus one is a bitch. Obvious dick in womb joke and all. All bar one off course. Yes, an extra-oneous F. Rise again dear cheapo.

And as he placed ring finger of his fishy right hand upon the pre-chopped and processed tree stump, declaring “take it and fuck off”, all was a bit more cagey and costing of those that never get told of the prices of alternate labour avoidance for profit.

Nice story so far dear observer. I think you’d like a little titillation for your money now. Bring forth babe percents and vital statistics.

What a placement of mind in such a being of knowledge. What could become? What it for removals of thing never cast, never worried, never done.

In the be ginning. A shrrod ploy to an ends. As all became seated and thrust needed no explanation.

All the Too Messy for Sci-Fi Complaints

Assuming GPT-3 is really good at story completion how can anyone say that errors in word sequencing are irrelevant for the provocation phrase issued to an AI when the purpose is completion from the source through sense and not the generation of a more precise bore?

Although the mathematics of a form of complexity may be essential, the actual origin of the mathematics might not be as essential as a way of introducing the definitive emergents as one would assume. Multiple originations of emergence isomorphism in the completeness of behaviour might and likely are possible.

The latest AI joke is about the Silly can’ts versus the car bonned. Oh, dear. 

Gradients and Descents

Consider a backpropagation pass which has just been applied to a network under learning. It is obvious that various weights changed by various amounts. If a weight changes little it can be considered good. If a weight changes a lot it can be considered an essential definer weight. Consider the maximal definer weight (the one with the greatest change) and change it a further per cent in its defined direction. Feedforward the network and backpropagate again. Many of the good weights will go back to closer to where they were before the definer pass and can be considered excellent. Others will deviate further and be considered ok.
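
One reading of that scheme as code; the tie-breaking and the exact categorisation of the in-between cases are assumptions. Here d0 and d1 are a weight’s change on the first pass and on the pass after the maximal definer has been nudged a further per cent, and the enum values match the tally numbering below.

#include <math.h>

enum { EXCELLENT = 0, GOOD = 1, OK = 2, DEFINER = 3 };

int classify(float d0, float d1, int was_max_change) {
    if (was_max_change)        return DEFINER;    /* the greatest first-pass change       */
    if (d0 * d1 < 0.0f)        return EXCELLENT;  /* the nudge pulled it back into place  */
    if (fabsf(d1) > fabsf(d0)) return OK;         /* it deviated further after the nudge  */
    return GOOD;                                  /* changed little and stayed put        */
}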

The signed tally of definer(3)/excellent(0)/good(1)/ok(2) can be placed as a variable of programming in each neuron. The per cent weight to apply to a definer, or more explicitly the definer history deviation product as a weight to per cent for the definer’s direction, makes a training map which is not necessary for using the net after training is finished. It does however enable further processing such as “excellent definer” detection. What does it mean?

In a continual learning system, it indicates a new rationale requirement for the problem as it has developed an unexpected change to an excellent performing neuron. The tally itself could also be considered an auxiliary output of any neuron, but what would be a suitable backpropagation for it? Why would it even need one? Is it not just another round of input to the network (perhaps not applied to the first layer, but then inputs don’t always have to be so).

Defining the concept of definer epilepsy where the definer oscillates due to weight gradient magnification implies the need for the tally to be a signed quantity and also implies that weight normalization to zero should also be present. This requires but has not been proven as the only sufficient condition that per cent growth from zero should be weighted slightly less than per cent reduction toward zero. This can be factored into an asymmetry stability meta.

A net of this form can have memory. The oscillation of definer neurons can represent state information. They can also define the modality of the net knowledge in application readiness while keeping the excellent all-purpose neurons stable. The next step is physical and affine coder estimators.

Limit Sums

The convergence sequence on a weighting can be considered isomorphic to a limit sum series acceleration. The net can be “thrown” into an estimate of an infinity of cycles programming on the examples. Effectiveness can be evaluated, and data estimated on the “window” over the sum as an inner product on weightings with bounds control mechanisms yet TBC. PID control systems indicate in the first estimate that differentials and integrals to reduce error and increase convergence speed are appropriate factors to measure.

Dynamics on the per cent definers so to speak. And it came to pass the adaptivity increased and performance metrics were good but then irrelevant as newer, better, more relevant ones took hold from the duties of the net. Gundup and Ciders incorporated had a little hindsight problem to solve.

Fractal Affine Representation

Going back to 1991 and Michael Barnsley developing a fractal image compression system (Iterated Systems’ FIF file format). The process was considered computationally intensive in time for very good compression. Experiments with the FIASCO compression system, which is an open-source derivative, indicate the best trade-off lies at low quality (about 50%), which is very fast but not exact. If the compressed image is subtracted from the input image and the difference further compressed as a residual a number of times, performance is improved dramatically.
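
A sketch of that residual pass, not FIASCO’s actual interface: code at low quality, decode, subtract from the input, then code what is left over again. fif_compress(), fif_decompress() and store() are hypothetical stand-ins for whatever coder is used.

typedef struct blob blob_t;                            /* hypothetical coded layer */
blob_t *fif_compress(const float *img, int npix, int quality);
void    fif_decompress(const blob_t *code, float *out, int npix);
void    store(const blob_t *code);

void residual_passes(const float *img, float *work, float *residual,
                     int npix, int passes) {
    for (int p = 0; p < passes; ++p) {
        blob_t *code = fif_compress(img, npix, 50);    /* fast, low quality, not exact */
        fif_decompress(code, work, npix);
        for (int i = 0; i < npix; ++i)
            residual[i] = img[i] - work[i];            /* what this pass failed to get */
        store(code);                                   /* keep each coded layer        */
        img = residual;                                /* the next pass codes the rest */
    }
}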

Dissociating secondaries and tertiaries from the primary affine set allows disjunct affine sets to be constructed for equivalent compression performance where even a zip compression can remove further information redundancy. The affine sets can be used as input to a network, and in some sense, the net can develop some sort of affine invariance in the processed fractals. The data reduction of the affine compression is also likely to lead to better utilization of the net over a convolution CNN.

The Four Colour Disjunction Theorem.

Consider an extended ensemble. The first layer could be considered a fully connected layer distributor. The last layer could be considered to unify the output by being fully connected. Intermediate layers can be either fully connected or colour-limited connected, where only neurons of a colour connect to neurons of the same colour in the next layer. This provides disjunction of weights between layers and removes a competition upon the gradient between colours.
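
A sketch of the colour-limited connection rule between two intermediate layers: each unit carries a colour label and only feeds units of the same colour in the next layer, so each colour keeps a disjunct weight set.

void colour_layer(const float *in, float *out, const float *w,
                  const int *col_in, const int *col_out, int n_in, int n_out) {
    for (int j = 0; j < n_out; ++j) {
        float s = 0.0f;
        for (int i = 0; i < n_in; ++i)
            if (col_in[i] == col_out[j])       /* same colour only */
                s += w[j * n_in + i] * in[i];
        out[j] = s;                            /* nonlinearity applied elsewhere */
    }
}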

Four is really just a way of seeing the colour partition and does not really have to be four. Is an ensemble of 2 nets of half size better for the same time and space complexity of computation, with a resulting lower accuracy of one colour channel, but in total higher in discriminatory performance by the disjunction of the feature detection?

The leaking of cross information can also be reduced if it is considered that feature sets are disjunct. Each feature under low to non detection would not bleed into features under medium to high activation. Is the concept of grouped quench useful?

Query Key Transformer Reduction

From a switching idea in telecommunications, an N*N array can be reduced (mostly functionally, due to sparsity) to an N*L array pair and an L*L array. Any cross-product essentially becomes (from its routing of an in into an out) a set of 3 sequential routings, with the first and last being the compression and expansion multiplex to the smaller switch. Cross talk grows to some extent, but this “bleed” of attention is a small consideration given the variance spread of having 3 routing weights product up to the one effective weight, and computation is less due to L being a smaller number than N.
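
As a purely illustrative count (the numbers are not from any particular model): with N = 512 and L = 64, the single N*N array holds 262144 weights, while the N*L compression, L*L switch and L*N expansion together hold 32768 + 4096 + 32768 = 69632, roughly 3.8 times fewer, at the cost of routing every in-to-out interaction through the smaller L-wide switch.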

The Giant Neuron Hypothesis

Considering the output stage of a neuronal model is a level sliced integrator of sorts, the construction of RNN cells would seem obvious. The hypothesis asks if it is logical to consider the layers previous to an “integration” layer effectively an input stage where the whole network is a gigantic neuron and integration is performed on various nonlinear functions. Each integration channel can be considered independent but could also have post layers for further joining integral terms. The integration time can be considered another input set per integrator functional. To maintain tensor shape, as two inputs per integrator are supplied, the first differential would be good also, especially where feedback can be applied.

This leads to the idea of the silicon connectome. Then as now as it became, integration was the nonlinear of choice in time (a softmax divided by the variable, as goes with [e^x-1]/x; a groovemax if you will). The extra net unineuron integration layer offering the extra time feature of future estimation at an endpoint integral of network evolved choice. The complexity of backpropagation of the limit sum through fixed constants and differentiable functions for a zero adjustable layer insert with scaled estimation of earlier weight adjustment on previous samples in the time series under integration for an ideal propagatable. Wow, that table’s gay as.

This network idea is not necessarily recursive, and may just be an applied network with a global time delta since last evaluation for continuation of the processing of time series information. The actual recursive use of networks with GRU and LSTM cells might benefit from this kind of global integration processing, but can GRU and LSTM be improved? Bistable cells say yes, for a kind of registered sequential logic on the combinationals. Considering that a Moore state machine layout might be more reductionist to efficiency, a kind of register layer pair for production and consumption to bracket the net is under consideration.

The producer layer is easily pushed to be differentiable by being a weighted sum junction between the input and the feedback from the consumer layer. The consumer layer is more complex when differentiability is considered. The consumer register really could be replaced by a zeroth differential prediction of the future sample given past samples. This has an interesting property of pseudo presentation of the output of a network as a consumptive of the input. This allows use of the output in the backpropagation as input to modify weights on learning the feedback. The consumer must be passthrough in its input to output, while storage of samples for predictive differential generation is allowed.

So it’s really some kind of propagational Mealy state machine. An MNN if you’d kindly see. State of the art, art of the state. Regenerative registration is a thing of the futured.

Post-Modern Terminal CLI

As is usual with all things computing, the easy road of bootstrap before security is just an obvious order of things. It then becomes a secondary goal to become the primary input moderation tool such that effective tooling brings benefits while not having to rely on the obscurity of knowledge. For example a nice code signature no execution tool where absolutely no code even becomes partially executed if the security situation indicates otherwise.

A transparent solution is a tool for development which can export a standard script to just run within today’s environment. As that environment evolves within the future it can take on the benefits of the tool, so maybe even to the point of the tool being replaced purely by choice of the user shell, and at a deeper level by a runtime replacing the shell interpreter at the system level.

The basic text edit of a script at some primary point in the development just requires a textual representation, a checksum in the compiled code which is in a different file and a checksum to allow a text override with some security on detecting a change in the text. This then allows possible benefit by a recompile option along with just a temporary use of the textual version. It won’t look that hard in the end with some things just having a security rating of “system local” for a passing observer.

ANSI 60 Keyboards? An Exception to the Rule?

More of an experiment in software completion. Jokes abound.

A keyboard keymap file for an ANSI 60 custom just finished software building. Test to follow, given that cashflow prevents buying and building of hardware on the near time scale. Not bad for a day!

A built hex file for a DZ60 on GitHub so you don’t have to build your own with an MD5 checksum of 596beceaa446c1f1b55ee5e0a738f1c8 to verify for duelling the hack complexity. EDIT: version 1.7.2F (Enigma Bool Final Release). Development is complete. Only bug and documentation fixes may be pending. 

It all stems from design and data entry thinking, and small observations like the control keys being on the corners like the thumbs to chest closeness of baby two-finger hackers instead of the alt being close in for the parallel thumbs of the multi-finger secretariat.

The input before the output, the junction of the output to our input. It’s a four-layer main layout with an extra four layers for function shift. Quite a surprising amount can be fit in such a small 60 keyspace.

The system allows intercepts of events going into the widget, yet the focus priority should be picking up the non-processed outgoings. Of course, this implies the atom widget should be the input interceptor to reflect the message for outer processing in a context. This implies that only widgets which have no children, or administered system critical widgets, can processEventInflow while all can processEventOutflow, so silly things have less chance of happening in the certain progress of process code.

Perhaps a method signature of super protected such that it has a necessary throws ExistentialException or such. Of course, the fact RuntimeException extends Exception (removing a code compilation constraint) is a flaw of security in that it should only have allowed the adding of a constraint by making (in the code compile protection against an existential) Exception extending RuntimeException.

Then the OS can automatically reflect the unhandled event back up the event outflow queue, along with an extra event (with a link to the child in, and an exposed list of its child widgets) to outflow. An OrphanCollector can then decide to still show the child widgets or not with the opportunity of newEventInflow. All widgets could also be allowed to newEventOutflowForRebound, itself a super protected method with a necessary throws ExistentialException (to prevent injection of events from non-administered widgets).

An ExistentialException can never be caught in user code to remove the throws clause and use of super try requires executive privilege to prevent executive code from being loaded by the ClassLoader. It could run but in a lower protection ring until elevated.

Accounts Year End 2020

No trading this year, payments in by director to cover bank charges and web services. Quite a year of nothing much happening on the contract front. I think COVID has had a vicious effect on many companies’ capital, but as the company has no creditors, there are no worries of being up against the wall this year.

An Interpolation of Codecs into the ISO Network Model

  1. Paper
  2. (Media Codec)
  3. Symbols
  4. (Rate Codec)
  5. Envelope
  6. (Ring Codec) 3, 2 …
  7. Post Office
  8. (Drone codec)
  9. Letter Box
  10. (Pizza codec)
  11. Name
  12. (Index codec)
  13. Dear

Considering the ISO network model of 7 layers can be looked at as an isomorphism to a letter delivery with Paper being the lowest hardware layer and Dear being the application layer, there is a set of 6 codecs which transform layer to layer and so a more exacting 13 layer model is just as obvious given the requisite definitions.

There would also exist a Loop Codec which would virtualize, via an application, a container for a virtual hardware layer on which another stack of 13 could be founded.
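
As a toy Java rendering of the list above (the odd entries are layers, the even ones the codecs between them; a Loop Codec would fold Dear back onto a virtualized Paper):

// Toy rendering of the 13-step letter analogy; names come from the list above.
enum PostalLayer {
    PAPER, MEDIA_CODEC, SYMBOLS, RATE_CODEC, ENVELOPE, RING_CODEC,
    POST_OFFICE, DRONE_CODEC, LETTER_BOX, PIZZA_CODEC, NAME, INDEX_CODEC, DEAR;

    boolean isCodec() { return name().endsWith("_CODEC"); }
    // A Loop Codec would map DEAR back onto a virtualized PAPER for a nested stack.
}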

23

The classic 3*4+1+1+4+(9-1)/2+[this one @23rd]+(9-1)/2, for a total of 27. The whole 163 and x^2-x+41 Technetium (+2) connection. Interesting things in number theory, along with sporadic groups and J4, which is the only one with an order factor of 43 and an 11^3. Promethium at 61 is connected somehow, maybe by 12 * 62 = 744, with something not doing the 10 “f-orbitals” thing, and 23 comes in on the uniqueness of factorization too, along with 105. Along with the 18 families of groups, 26 (or 27) + 18 = 44 (or 45) in cubic elliptic varieties of the discriminant.
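
For reference, the 163 part can be made concrete: x^2-x+41 is prime for x = 1 … 40, its discriminant is 1 - 4*41 = -163 (the largest Heegner number), and e^(π√163) ≈ 640320^3 + 744, which is where that 744 = 12 * 62 keeps showing up.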

26 letters in the alphabet plus space? Rocks with patterned circles on an island? Considering one of the 44 is the circle integer modulo ring with no “torsion”, there are kind of 43 bending varieties and some kind of dimension null over a double-bend “cover” inclusion as a half factor of one of the main 18 sequence groups. Likely a deep connection to square-free factors, the “Möbius mu”, and topological orientability.

Polynomial Regression Estimators

Consider a sampled sequence of n samples and an interpolation of order n. The sample sequence can be differentiated by backward and forward differences of all n samples to make a first-differential sequence of n or more elements. This too has a polynomial fit. That polynomial can be integrated to make an order n+1 polynomial with a new constant, which can be estimated by a regression fit of the n samples. This can then make an (n+1)th estimation to show a fit, ad infinitum. Weighting the regression error based on sample time locks more history and less prediction into the forecast, but fits less well on the predictive end. At the opposite extreme, the forecast is based on a forecast, not on history. In between is a concept of optimal.
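
Spelled out, one pass of the estimator goes something like this (a sketch, with w_i as the time weights):

d_i ≈ (y_(i+1) - y_(i-1)) / 2    (central differences; forward/backward at the ends)
p(t) = polynomial fit of the d_i
P(t) = ∫ p(t) dt + C
C = the value minimising Σ w_i (P(t_i) - y_i)²

with heavier weights on late samples locking the fit to history, and lighter ones letting the predictive end float.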

A genetic algorithm optimizing the weighting provides a fitness score based on future measured truth. The population spread acts as a Monte Carlo, and some selection for spreading entropy as well as future weight would add entropy flair for efficiency, by the association of prediction-cluster elimination and outlier promotion for risk estimates. An irony of population size and death-by-accounting in genetic algorithms weeds out some “bum notes” but keeps the “right on” in the ill-computed silicon heaven (via Löb’s theorem of truth by confirmed assumption). Hence an eviction cache, as in silicon hardware. What measures the crash instability of markets in the recession local optimum?

Yes, I do imply logic machines are operating reality. I do not think all the machines use the same operator algebra. Some algebras survive, some do not. There is nothing in the closure complexity of efficient algebras supporting the accumulation of axioms as leisure free from a suppressed fight.

And Physics

The number of light bosons stems from the cyclotomy of 18 (divisors 1, 2, 3, 6, 9, 18 and new roots 1, 1, 2, 2, 6, 6) for 18 normal bosons (6 free ones as 18 − 12 [not fermion bound], sounds like some regular “found bosons”). If the equality of the mass-independent free-space view to zero is just an approximation to the reciprocal of a small oscillation, then a differential equation for such is just scaled by units of Hz², and having that would place the cyclotomy at 20 (divisors 1, 2, 4, 5, 10, 20 and new roots 1, 1, 2, 4, 4, 8) for 20 dark bosons perhaps? Or maybe it works inversely, reducing the cyclotomy to 16 (divisors 1, 2, 4, 8, 16 and new roots 1, 1, 2, 4, 8) for 16 dark bosons?
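
Those per-divisor “new roots” are just Euler’s totient φ(d) of each divisor, and the counts always close back up to n:

Σ over d|n of φ(d) = n:   18 = 1+1+2+2+6+6,   20 = 1+1+2+4+4+8,   16 = 1+1+2+4+8.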

Or “free dark bosons” at a tally of 2 (or -2)? I think I used η with a floating ~ (tilde) to indicate this secondary oscillation. Fermi exclusion unique factor domain expansion? Non-unique compaction “gravity”?

What tickles my mind is the idea of 2 “ultra free dark bosons”. Put another way: <<So this Pauli exclusion of fermions. If bosons (some of them theoretical) confine and attach to fermions, giving them a slightly less than expected Pauli exclusion when confined, does this imply a kind of “gravity-like” force? If the bosons exist in a Q[√-23] field, or does the “a de Moivre number and p is a prime number. Unique factorizations of cyclotomic integers fail for p > 23.” provide a dark-energy-like effect, as all below 24 have more Pauli exclusion of state due to the lack of degenerate factorization of a 23-particle “super-force”?>>

But 20, and an inverse of the Hz² (+2, −2) => (*Hz², /Hz²) for example: 23 is the prime larger than 20, itself an essential behaviour-encompassing number, and 23 is also the prime less than 24, itself another essential behaviour-encompassing number. Most exclusive field of 23 and a totient amongst many. So, like the disjoint 23 feedback being maximal, it presents the most of its dark influence on dark, dark influence for zero black kinda dark.

15015 and 255255 on the Beyond

The peaks within and without, crossing the R0 of gain into implementation in reality. Comprehensive ring gates and information transport and regenerative bits held fast by tallies of entropy. Rings within subsets in later fields; may we walk in, shining bright with the power of imaticity; may we move toward imagionics and a theory of technologies.

So the Hz² must have come from somewhere. Equality of something being equal to a constant over the angular energy. An intuition that something with higher angular energy is more E=mc² massive and has a greater boson intensity of flux. This multiplies with the bosonic cyclotomics to field-scale them. To keep within the small constant η, if it is not zero but oh so close to it (relatively tiny, and could be Planck’s, but this is not proven), the fermionic mass-independent factor has to shrink in scale by reducing velocities, accelerations and jerks, making it more certain in nature and maintaining the constancy of η. True enough, it could be a simplistic gamble on the nature of energy density, or it could just be more flexible in quadrature of complex phase lead and lag shift from zero while still being “fast and loose”.

Free42 Android App Longer Term

A very nice calculator app. I’ll continue to use it. What would I change? And would I change what I’d changed? A fork with extras has begun and is in development.

  • I’d have a SAVE and LOAD with load varieties (LOADY, LOADZ, LOADT for a register and all stack registers higher if all 4 stack items are not to be restored, along with LASTX) depending on restoring the right stack pattern after a behaviour, which makes for first-class user-defined functions. SAVE? would return how many levels of saving there are.
  • Perhaps variables based on the current program location (or section). A better way of reducing clutter than a tree, while accessing the tree would need a new command specifying the variable context. This would lead to a minimal CONTEXT to set the LBL-style recall context, and THIS to set this context as per usual but without the variable-in-context clutter. A simple default to change the context when changing program space ensures consistency of being. In fact, nested subroutines could also provide a search order for an outer context. THAT could just remove one layer of the context, or more precisely change the current one to the one below on the call stack, such that THAT THAT would get the second nesting context if it exists. LSTO helps a little.
  • Some mechanics for the execution of a series term generator which, by virtue of a modified XEQG (execute generator), could provide some faster summation, or perhaps by flags a product, a sum, a term or a continued-fraction precision series acceleration.
  • Differential (numeric) and integral (endpoint numeric, multiple kinds, all with one implicit bound of zero for the constant at zero) algorithms — not that I would reimplement them 😀, as I would like a series representation by perhaps an auto-generated generator. So XEQG would have a few cousins.
  • Although Mathematica solving might not give %n inserts for parameterizing a solution for constants, this does not prevent XEQG doing a differential either-side sampling at high order and reducing it geometrically for a series estimation of the exact value. In terms of integrals, an integral of x^n.f(x) where n goes to zero provides the first bit of insight into integrals as convergent sets of series, with an exclusion NonconvergentAreaComplex[] on Gödelian (made to make a method of solve fail) differential equations (or parts thereof). Checking the convergents of the term supplied to XEQG and cousins allows for sensible errors and perhaps transforms to pre-operators on the term provider function. SeriesRanged[] (containing an action as a function) lists for the other parts, with correct evaluation based on value — and how does this go multivariate? Although this looks out of place, it relates to series solutions of differential equations with more complex forms based on series of differentials. The integral of x.f(x)/x by parts is another giver of two more generators. The best bit is that the “integral” from such a form is just evaluated at one endpoint (maybe a subtraction for definite integrals), and as such forms include weighted series they can often be evaluated by the series acceleration of a small number of differentials of the function to be integrated. The differentials themselves can often be evaluated accurately as a series converging as the delta is geometrically reduced, with the improvements in the estimates being considered as new smaller terms in the series. So an integral evaluation might come down to (at 9 series terms per acceleration) about 2*90 function invocations, instead of depending on Simpson’s rule, which has no series weighting to “accelerate” the summation. Also, integration up to infinity might be a simpler process when the limits are separated into two endpoint integrals, as the summation over a limit to an estimation of convergence at infinity would not need as many conditional test cases on none, both or either one. As I think integrals should always return a function with parametric implicit constants, should not differentials return a parameterized function by default, boolean the possibility of retrieving the faded constants? An offsetable self-recovery of diminished offset generic? SeriesRanged[Executive[]][ … ]
  • Free42 Android
  • Perhaps an ACCESS command for building new generators (with a need to get a single generated term) with a SETG (to set the generator evaluating ACCESS), and XEQG can become just a set of things to put in SETG “…”, making for easy generators of convergents and other structures. GETG for saving a small text string for nesting functions might be good, but not essential, and might confuse things by indirection possibilities. Just having a fixed literal alpha string to a SETG is enough, as it could recall ACCESS operators on the menu like MVAR special programs (and not like INPUT programs). XEQG should still exist, as there is the SETG combiner part (reducer) as well as the individual term generator (mapper) XEQG used for a variety of functions. This would make for easier operator definition (such as series functions by series accelerations, or convergent limit differentials by similar on the reduction of the delta) without indirect alpha register calling of iterates.
  • A feature to make global labels go into a single menu item (the first) if they are in the same program, which then expands to all labels in the current program when selected, for code management.
  • +R for addition with residual: the fraction of X that was not added to Y is returned in the X register, and the sum is returned in Y (as sketched below). This would further increase precision in some algorithms.
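
+R is essentially the classic two-sum (compensated addition) trick; a rough Java illustration of the X/Y split it would return, with TwoSum as a made-up name:

// Compensated addition ("two-sum"): sum goes to Y, the rounding residual to X.
final class TwoSum {
    final double sum, residual;

    TwoSum(double x, double y) {
        double s = x + y;
        double yVirtual = s - x;              // the part of y actually absorbed
        double xVirtual = s - yVirtual;       // the part of x actually absorbed
        this.sum = s;
        this.residual = (x - xVirtual) + (y - yVirtual);
    }

    public static void main(String[] args) {
        TwoSum r = new TwoSum(1e16, 3.14159);
        System.out.println("Y (sum)      = " + r.sum);
        System.out.println("X (residual) = " + r.residual);  // what +R would leave in X
    }
}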

Rationale (after more thought and optimization)

  • Restoring the stack is good for not having to remember what was there and whether you need to store it. It requires a call stack frame connection, so maybe SAVE? is just the call stack depth and so not required. (4 functions: LOAD and SAVE, with LOAD placing the old loaded X into LAST X, plus two commands used before LOAD — USE to indicate a stack consumption effect after restore, and MAKE to leave one stack entry next lowest as an output.)
  • Although local variables are good, in-context variables would be nice to see. Clutter from other contexts is avoided, or at least placed more keystrokes away from the main variables. This would also be easier to connect to the call stack frame. (3 functions: CONTEXT, THIS and THAT.) RCL tries CONTEXT before the call-stack-program-associated variables. No code spams variables into other namespaces. STO stores into its associated variable space. This ensures an import strategy. The .END. namespace can be considered an initial global space, so the persistence of its content upon GOTO . . is useful, and XEQ “.END.” should always be available.
  • INTEG and SOLVE could be considered operators, but with special variables. Separation of the loop to reduce on from the map function makes more general summation functions possible given single-term functions. It would be more general to have 3 commands so that the reducer, the mapper and the variable to map over could all be set, but is that level necessary? Especially since, in use, a common practice of setting the reducer and applying it to different maps seems more useful. But consistency and flexibility might have PGMRED, PGMMAP and MAPRED “var” for generality in one variable, with ACCESS in the reducer setting the right variable before executing the mapping. (4 functions.)
  • Addition residual is a common precision technique. (1 function) +R.
  • I’d also make SOLVE and INTEG re-entrant (although not necessarily for a nested function call — a function already used in call stack frames, with a stack check?) by copying salient data on process entry, along with MAPRED, where the PGMRED-set function can be used again and so does not need a nested-reuse check.
  • As to improvements in SOLVE, it seems that detection of asymptotes and singularities confuses interval bisection. Maybe adding a small amount and subtracting a small amount moves actual roots but leaves singular poles alone, swamped by infinity. Also, the sum series of the product of the values and/or gradients may or may not converge as the pole or zero is approached.
  • Don’t SAVE registers or flags, as this is legacy stuff. Maybe a quadratic (mass centroid) regression, a Poisson distribution and maybe a few others, as the solver could work out inverses. Although there is the inconsistency of stack output versus variable output. Some way of auto-filling MVAR from the stack and returns for 8 (or maybe 6 (XYZT in and X subtracted out, and …)) “variables” on the SOLVS menu? Maybe inverses are better functionality, but the genericity of solvers is better for any evaluation. Allow MVAR ST X etc., with a phantom SAVE, and have MRTN for an expected output variable before the subtraction, making another “synthetic” MVAR or an exit point when not solving (and solving with an implicit – RTN, and definite integrals being a predefinition of a process before a split by a subtractive equation for solving)? It would, of course, need MVAR LAST X to maybe be impossible (a reasonable constraint of an error speed efficiency certainty). (5+1 menu size.) Redefinition of many internal functions (via no MVAR and automatic solver pre- and postamble) would allow immediate inverse solves with no programming (SOLVE ST X, etc., with no special SOLVE RTN as it’s a plain evaluation). This makes MRTN the only added command, plus the extra ST modes on SOLVE, and also a way of function specification for inbuilt ones. The output to solve for can be programmatically set as the X register value when PGMSLV is executed, and remembered when SOLVE is used next.
  • Register 24 is lonely. Perhaps it should contain weighted n, Σy — but no, that already exists. Σx²y seems better, for the calculation of the weighted variance. That would lead to registers 0 to 10 being fast scratch saves. The 42 nukes other registers in ALLΣ anyway, and I’d think not many programs use register 24 instead of a named variable. I’d be happy about only calculating it when in ALLΣ mode, as I never switch, and people who do usually want to keep register compatibility of routines for HP-41 code. Maybe PVAR for the n/(n−1) population variance transforms, although this is an easy function for the user to write. A good metric to measure what gets added. Except for +R, which is just looping and temporary variables for residual accumulation, with further things to add assuming the LAST Y would be available etc.
  • I’d even suggest a mode using all the registers 0 to 10 for extra statistical variables and a few of those reserved flags (flag 64). I think there is at least one situation (chemistry) where quadratic regression is a good high-precision idea. This makes REGS saving a good way of storing a stats set. Making the registers count down from the stats base in this mode seems a good idea. The following would provide quadratic regression with lin, log, exp and pow relation mapping on top of it, for a CFIT set of 8, along with the use of R24 above. An extra entry on the CFIT MODL menu, with an indicator for that enablement toggle of the extra shaping and register usage (flag 64 set), with an automatic enable of ALLΣ. As the parabolic constant would not be often accessed, it would be enough to store it and the other ones after a fit, not interfering with live recalculation so as to not error by assumption. It would, of course, change the registers CLΣ sets to zero. Flag 54 can perhaps store the quadratic fitting model in mode. Quadratic regression details below (the normal equations are sketched just after this rationale list). Although providing enough information to manufacture a result for the weighted standard deviation, it becomes optimal to decide whether to add WSD or an XY-interchange mode on a flag to get inverse quadratic regression, which would provide 12 regression curve options. The latter would need to extend the REGS array. FCSTQ might be better as a primary command to obtain the forecast root when the discriminant is square-root-subtracted negative, as two forecast roots would exist. The most positive one would likely be more real in many situations. Maybe the linear correlation coefficient says something about the root to use, and FCSTQ should use the other one?
    • R0 = correlation coefficient
    • R1 = quadratic/parabolic constant
    • R2 = linear constant
    • R3 = intercept constant
    • R4 = Σx³
    • R5 = Σx⁴
    • R6 = Σ(ln x)³
    • R7 = Σ(ln x)⁴
    • R8 = Σ(ln x)²y
    • R9 = Σx² ln y
    • R10 = Σ(ln x)² ln y
  • Flags still being around on the HP-28S was unexpected for me. I suppose it makes me not want to use them. The general user flags of the HP-41 have broken compatibility anyway, as 11 to 18 are system flags on the HP-42S. There would be flags 67, 78, 79 and 80 for further system allocations.
  • I haven’t looked at whether the source for the execution engine has a literal-to-address resolver with an association struct field for speed, with indirection handled in a similar manner, maybe even down to address function pointer filling-in of checks and error routines like in a virtual dispatch table.
  • If endpoint integrals provide wrong answers, then even the investigation into the patterns of deviation from the true grail summates to eventually make them right in time. A VirtualTimeOptimalIngelCover[] is a very abstract class for me today. Some people might say it’s only an analytical partial solution to the problem. DivergentCover[] as a subclass of IngelCover[], which itself is a list container class of the type IngelCover. Not quite a set, as removing an expansive intersection requires an addition of a DivergentCover[]. It’s also a thing about series summation order commutativity for a possible fourth endpoint operator.
  • MultiwayTimeOptimizer[ReducerExecutive[]][IngelCover[MapExecutive[]][]] and ListMapExecutiveToReturnType[] and the idea of method use object casting. And an Ingel of classes replaced the set of all classes.
  • I don’t use printing in that way. There’s an intermediate adapter called a PC tablet mix. The HP-41 was a system — a mini old mainframe. A convenience power-efficiency method. My brother’s old CASIO with just P1 and P2 was my first access to a computational device. I’m not sure the reset kind of goto was Turing complete, given not enough memory for predicate register branch inlining.
  • ISO 7 Layer to 8 Layer, insert at level 4, virtualized channel layer. Provides data transform between transmit optimally and compute optimally. Is this the DataTransport layer? Ingel[AutomaticExecutive[]][].
    1. Paper
    2. (Media Codec)
    3. Symbols
    4. (Rate Codec)
    5. Envelope
    6. (Ring Codec) 3, 2 …
    7. Post Office
    8. (Drone codec)
    9. Letter Box
    10. (Pizza codec)
    11. Name
    12. (Index codec)
    13. Dear
  • Adding IOT as a toggle (flag 67) command in the PRINT menu is the closest place to IO on the Free42. Setting the print upload to a kind of object entity server. Scheduling compute racks with the interface problem of busy until state return. A command CFUN executes the cloud functions which have been “printed”. Cloud sync involves keeping the “printed” list and presenting it as an options menu in the style of CATALOG for all clouded things. NORM (auto-update publish (plus backup if accepted), merge remote (no global .END.)) and MAN (manual publish, no loading) set the sync mode of published things, while TRACE (manual publish, merge remote plus logging profile) takes debug logs on the server when CFUN is used but not for local runs. Merge works by namespace collision of local code priority, and no need to import remote callers of named function space. LIST sets a bookmark on the server.
  • An auto QPI mode for both x and y. In the DISP menu. Flag mode on in register 67. Could be handy. As could a complex statistics option when the REGS array is made complex. It would be interesting to see options for complex regression. As a neural node functor, a regression is suitable for propagation adaptation via Σ+ and Σ- as an experiment into regression fit minimization.
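
As mentioned in the quadratic regression item above, the normal equations for fitting y = ax² + bx + c show why Σx³, Σx⁴ and Σx²y (plus the ln x / ln y variants for the log, exp and pow mappings) are exactly the extra sums needed:

a·Σx⁴ + b·Σx³ + c·Σx² = Σx²y
a·Σx³ + b·Σx² + c·Σx = Σxy
a·Σx² + b·Σx + c·n = Σy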

Minecraft Mod Development

Got distracted. It will work out fine as I move backwards and forwards between this and DSDev. Version 1.16 of Minecraft now has a Forge to make mods with. I had an idea and started on a mod. There are a lot of changes since I last dipped a toe in the water. It looks as though it will make many things easier.

After a little foray into crafting recipes, simple potion time extensions and extra Redstone blocks, I could move on to more complex potions with new effects, or maybe even mobs. Mobs are, however, less likely than other gameplay elements.

  • Enchantments – nice but seems like a stats and number modification game.
  • Mobs – quite a lot already, but mob AI looks like an interesting thing to improve.
  • Crafting – best with new blocks with uses.
  • Brewing – similar to enchantments but can have player effects, and there is much scope with Thick Potion or Mundane Potion expansion.
  • Non Block Items – could have uses (Food, Item Frames …) but would have to have utility, not just be another thing, to be of interest. I’m adding a Written Book, for example.
  • Block Items – there are already many, but I find adding to the Redstone blocks an interesting one for mechanisation and automation. Maybe new technology is possible? I remember writing a teleport chest a long while ago, and now the Ender Chest has some of the same functionality but is better.

It’s nice after getting used to things like @ObjectHandler and other new constructs. There are still a few assumptions in the documentation, such as default loot tables for blocks.
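
For what it’s worth, block registration in 1.16 Forge goes through DeferredRegister; roughly this shape, with “examplemod” and “redstone_gadget” as placeholder names rather than anything from my actual mod:

import net.minecraft.block.AbstractBlock;
import net.minecraft.block.Block;
import net.minecraft.block.material.Material;
import net.minecraftforge.fml.RegistryObject;
import net.minecraftforge.registries.DeferredRegister;
import net.minecraftforge.registries.ForgeRegistries;

// Hedged sketch of the usual 1.16 DeferredRegister pattern; names are placeholders.
public final class ModBlocks {
    public static final DeferredRegister<Block> BLOCKS =
        DeferredRegister.create(ForgeRegistries.BLOCKS, "examplemod");

    public static final RegistryObject<Block> REDSTONE_GADGET =
        BLOCKS.register("redstone_gadget",
            () -> new Block(AbstractBlock.Properties.create(Material.ROCK)));

    // In the mod constructor:
    //   ModBlocks.BLOCKS.register(FMLJavaModLoadingContext.get().getModEventBus());
}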

DSi Homebrew

I decided to start some DSi homebrew as a little fun project. Just looking into it, it seems I can do a GL2D screen and a console with a keyboard quite easily. And then a little audio.

With a 128 kB texture, it looks possible to have about 512 (16 * 16) glyphs on the GL2D layer in 256 indexed colours. That should be good enough to start with. I suppose I’ll find out how to use multiple VRAM banks.
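
As a sanity check on that budget, assuming 8-bit indexed pixels: 512 glyphs × (16 × 16) pixels × 1 byte = 131,072 bytes = 128 kB, which exactly fills one of the 128 kB VRAM banks.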

At the moment I’m stuck on this SD card not being recognized with various formatting. So I’ll have to get a nice new class 10 original one to check.

Open code for an obvious game to come. Next, to look at some auto animation and some 3D models for import.

So I’ve managed to work out some things and got the memory pit overflow exploit working. The “unlaunch” installer does not install, so it’s just a case of keeping with the memory pit exploit whenever I need homebrew access.

There are still things to work out like why the MOD file does not run the next one, although this is more likely related to why the event loop only seemed to go through once. But that’s coding for another day.

So the generic menu is working. I’m still looking into why the switch back and forth between 2D and 3D on the main screen resets the image to magenta, as it seems such a switch should just set one register. An automatic way of handling foreground sound, and hooks into a game class, seem to be the logical next things to do.

This would make game selection something that could be placed in the options, maximizing utility of the 4 MB limit. Finding a way of decompressing textures would also seem to free about 100 kB, which is a lot in simple game designs.

To maximise sound utility, it might be possible to replace some of the sounds in the .mod files with ones to be used in the game, as this seems a possible way to save memory. I also must find out how to further reduce the file size of the .wav files.

So it seems I have about 500 kB for game logic and data, excluding sound and graphics. I have defined various classes: GameLogic (for generation), CTL (for control of the main machine loop), Audio (for triggering audio), BG (for background control) and Font (for 2D font overlays on the main display). This nicely abstracts away all the machine setup and configuration.

Been doing graphics for a game idea. 8 by 8 is very tiny but fun. I seem to have 11 rows of 32 tiles left. I’m thinking of how to utilize this for best effect. I’m very likely to use genetic algorithms to make the AI effective. I have had some good ideas to abstract this into the enemy design.

DCS ASCII Map?

I think I might do a Ham radio licence. I’ve been thinking about it for a few weeks. It might be fun. I’ve been thinking of experimenting with using DCS squelch codes for data transmission of character streams. It should be possible using the 83 codes available with easy mapping.

023 @    114 N    205 r+   306 lf   411 0    503 :    703 sp
025 A    115 O    223 r-   311 '    412 1    506 ;    712 !
026 B    116 P    226 g+   315 (    413 2    516 <    723 "
031 C    125 Q    243 g-   331 )    423 3    532 =    731 £
032 D    131 R    244 b+   343 +    431 4    546 >    732 $
043 E    132 S    245 b-   346 ,    432 5    565 ?    734 %
047 F    134 T    251 up   351 -    445 6             743 ^
051 G    143 U    261 dn   364 .    464 7             754 &
054 H    152 V    263 le   365 /    465 8
065 I    155 W    265 ri   371 \    466 9
071 J    156 X    271 dl
072 K    162 Y
073 L    165 Z
074 M    172 *
         174 #

This would be easy to integrate into a multipurpose app to connect on digital modes, for a low bandwidth 300 baud signal at 23 bits per character. This would be quite reliable as a means of doing a more modern RTTY. That just leaves ` _ | and ~ in base ASCII to do later, with 20 (11 and 9) codes “free” — the 2xx and the 6xx lines. This gives the 63 printable characters, and the 20 control characters with no print, along with a special control included in printing (dl, for delete correction), for 83 in total.

So the 2xx codes (non-destructive locators, except “delete”, the anti-time locator) are colour saturation and direction control with delete (which is correction “time” dynamics, perhaps in a 6-bit code), and the 6xx codes are where more complex things happen. A basis repetition rate for distance starts, and the coding uses this as a basis to transmit on. So a basis of 16 repetitions means each symbol is sent 16 times, for a 1/16 data rate. 612 uses 2^n repetitions based on a log of the number of rp after the symbol to be repeated: 2, 4, 8, 16 … after rp, rprp, rprprp … 662 returns to a maximum basis of repetitions and attempts to reduce it, to keep the number of 627 messages down.

The basis and the use of 612 might lead to a 662 if the decoder is not in synchronization with respect to the basis of repeats. This basis is ignored at the higher-level code and is just a summation over noise to increase S/N by the symbol repetition.

606 sy – synchronous idle 
612 rp – repetition of x[rp]x or x[rp]x[rp]xx (7)
624 ra – rep acknowledge all reps in RX in TX
627 re – rep acknowledge with err correct as 624
631 ri – rep basis increase request (2*)
632 rd – rep basis decrease request (2/)
654 ok – accept basis repetition count by request
662 un – unsync of repetition error reply (max)
664 cq – followed by callsign and sy termination

This allows for a variable data distance at a constant rate, especially if the RX has a sampling of code expectation and averaging over the number of symbol reps. It also synchronizes the start of many DCS codes, but would reduce the speed of lock by needing the code aligned.
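
A small illustrative Java sketch of that repetition basis: encode repeats each symbol basis times, and decode majority-votes over each run of basis received codes, which is where the S/N gain comes from:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: symbols are the three-digit DCS codes as strings.
final class RepetitionBasis {
    static List<String> encode(List<String> symbols, int basis) {
        List<String> out = new ArrayList<>();
        for (String s : symbols)
            for (int i = 0; i < basis; i++) out.add(s);
        return out;
    }

    // Majority vote over each block of `basis` received codes.
    static List<String> decode(List<String> received, int basis) {
        List<String> out = new ArrayList<>();
        for (int i = 0; i < received.size(); i += basis) {
            Map<String, Integer> tally = new HashMap<>();
            for (int j = i; j < Math.min(i + basis, received.size()); j++)
                tally.merge(received.get(j), 1, Integer::sum);
            out.add(tally.entrySet().stream()
                    .max(Map.Entry.comparingByValue()).get().getKey());
        }
        return out;
    }
}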

Extended codes could be used to extend the coding to include other things. This is not necessary, and 83 symbols are enough. This is a good start, and extras are fine though. Even precise datarate coding lock would give better performance over DX at high repetition basis.

A modified form of base64 encoding, along with digital signatures (ElGamal?), could provide good binary 8-bit transmission and good certainty of block reception. A return of the good signature, or the false signature on error, makes for a good block retransmit given a simplex window size of 1. In this case, synchronous idle would be a suitable preamble, and the 2xx and 6xx codes would be ignored as part of the base64-esque stream (except 606, for filling in empty places in the blocks of 5 in the base64 code).