DCS ASCII Map?

I think I might get a ham radio licence. I’ve been thinking about it for a few weeks, and it might be fun. I’ve also been thinking of experimenting with DCS squelch codes for data transmission of character streams. It should be possible with a simple mapping over the 83 available codes.

023 @   114 N   205 r+   306 lf   411 0   503 :   703 sp
025 A   115 O   223 r-   311 ′    412 1   506 ;   712 !
026 B   116 P   226 g+   315 (    413 2   516 <   723 ″
031 C   125 Q   243 g-   331 )    423 3   532 =   731 £
032 D   131 R   244 b+   343 +    431 4   546 >   732 $
043 E   132 S   245 b-   346 ,    432 5   565 ?   734 %
047 F   134 T   251 up   351 -    445 6           743 ^
051 G   143 U   261 dn   364 .    464 7           754 &
054 H   152 V   263 le   365 /    465 8
065 I   155 W   265 ri   371 \    466 9
071 J   156 X   271 dl
072 K   162 Y
073 L   165 Z
074 M   172 *
        174 #

This would be easy to integrate into a multipurpose app connecting on digital modes, as a low-bandwidth 300 baud signal at 23 bits per character. It would be quite reliable as a more modern RTTY. That just leaves ` _ | and ~ in base ASCII to do later, with 20 codes still “free” (the 2xx and 6xx lines). This gives the 63 printable characters, the 20 non-printing control characters, and one special control included in printing (dl, for delete correction), for 83 in total.
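A minimal sketch of the code-to-character mapping in software (Python here for brevity; only a subset of the table above is filled in, and the full 83-entry map would be built the same way):

```python
# Sketch: map DCS squelch codes to characters per the table above.
# This subset is illustrative; the full 83-code table extends it.
DCS_TO_CHAR = {
    "023": "@", "025": "A", "026": "B", "031": "C", "032": "D",
    "043": "E", "047": "F", "051": "G", "411": "0", "412": "1",
    "703": " ", "712": "!",
}
CHAR_TO_DCS = {c: d for d, c in DCS_TO_CHAR.items()}

def encode(text):
    """Translate a character stream into a DCS code stream."""
    return [CHAR_TO_DCS[c] for c in text]

def decode(codes):
    """Translate a DCS code stream back into characters."""
    return "".join(DCS_TO_CHAR[d] for d in codes)
```

Round-tripping a short message through encode and decode checks the mapping is one-to-one.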

So the 2xx codes (non-destructive locators, except “delete”, the anti-time locator) are colour saturation and direction control plus delete (whose correction “time” dynamics could perhaps fit in a 6-bit code), while the 6xx codes are where more complex things happen. A basis repetition rate is set for distance, and the coding uses this as a basis to transmit on: a basis of 16 repetitions means each symbol is sent 16 times, for a 1/16 data rate. 612 uses 2^n repetitions, with n given by the number of rp symbols after the symbol to be repeated: 2, 4, 8, 16 … for rp, rprp, rprprp … 662 returns to a maximum basis of repetitions and attempts to reduce it, to keep the number of 627 messages down.

The basis and the use of 612 might lead to a 662 if the decoder is not in synchronization with respect to the basis of repeats. The basis is ignored at the higher protocol level; it is just a summation over noise to increase S/N by symbol repetition.

606 sy – synchronous idle 
612 rp – repetition of x[rp]x or x[rp]x[rp]xx (7)
624 ra – rep acknowledge all reps in RX in TX
627 re – rep acknowledge with err correct as 624
631 ri – rep basis increase request (2*)
632 rd – rep basis decrease request (2/)
654 ok – accept basis repetition count by request
662 un – unsync of repetition error reply (max)
664 cq – followed by callsign and sy termination
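The repetition basis can be sketched as follows (a hedged illustration: majority voting per group of repeats is my assumption for how the RX summation would recover each symbol):

```python
from collections import Counter

def tx_with_basis(symbols, basis):
    """Send each symbol `basis` times: a basis of 16 means 1/16 data rate."""
    out = []
    for s in symbols:
        out.extend([s] * basis)
    return out

def rx_with_basis(stream, basis):
    """Majority-vote each group of `basis` received symbols to raise S/N."""
    decoded = []
    for i in range(0, len(stream), basis):
        group = stream[i:i + basis]
        decoded.append(Counter(group).most_common(1)[0][0])
    return decoded
```

A corrupted repeat inside a group is outvoted by the remaining copies, which is the S/N gain the basis buys.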

This allows for a variable data distance at a constant rate, especially if the RX samples against code expectation and averages over the number of symbol repetitions. It also synchronizes the start of many DCS codes, though it would slow the lock by needing the code aligned.

Extended codes could be used to extend the coding to include other things. This is not necessary, as 83 symbols are enough and a good start, but extras are fine. Even a precise data-rate coding lock would give better performance over DX at a high repetition basis.

A modified form of base64 encoding, along with digital signatures (ElGamal?), could provide good 8-bit binary transmission and good certainty of block reception. Returning the good signature, or the false signature on error, makes for a good block retransmit given a simplex window size of 1. In this case synchronous idle would be a suitable preamble, and the 2xx and 6xx codes would be ignored as part of the base64-esque stream (except 606, for filling in empty places in the blocks of 5 in the base64 code).
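A sketch of the base64-style packing (an assumption on my part: this uses the standard 3-bytes-to-4-symbols grouping rather than the blocks of 5 above, the 606 “sy” fill is stood in for by a placeholder value, and the mapping of the 64 indices onto actual DCS codes is left out):

```python
def pack_bytes(data):
    """Pack 8-bit bytes into 6-bit symbol indices, base64-style:
    every 3 bytes become 4 symbols; a stand-in for 606 'sy' pads
    short final blocks."""
    SY = -1  # placeholder for the 606 synchronous-idle fill symbol
    out = []
    for i in range(0, len(data), 3):
        chunk = data[i:i + 3]
        n = int.from_bytes(chunk.ljust(3, b"\0"), "big")
        syms = [(n >> s) & 0x3F for s in (18, 12, 6, 0)]
        out.extend(syms[:len(chunk) + 1] + [SY] * (3 - len(chunk)))
    return out
```

The symbol indices for b"Man" match the classic base64 worked example (T, W, F, u as indices 19, 22, 5, 46).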

COVID-19

Business has not really been affected, so I am still available for work, and sitting in isolation thinking about the things for which I want to set a direction.

It’s a nasty infection. Take care.

9+4+1+1+3*4=27 and a 9th Gluon for 26 Not

It still comes to mind that the “Tits Group gluon” might be a real thing: although there seem to be eight, the ninth one is in the symmetry of self-attraction, perhaps causing a shift in the physical inertia from a predicted, rather than filled-in, constant of nature.

There would appear to be only two types of self-dual coloured gluons needed in the strong nuclear force, as though the cube roots of unity were entering into the complex analysis within the equations of the universe.

9+4+1+1+3*4=27

The 3*4 is the fermionic 12, while the relativistic observational deviation (the abstract conceptual observation frame versus the actual moving observation particle) provides the cyclotomic 9+4+1+1 = 15, one of which is not existential within itself but is just a sub-factor of one of the other essentials. It also points out a 3*5 that may be somewhere too.

Given the Tits gluon, the number of bosons would be 14; removing 8 for gluons leaves 6, removing 4 for the electroweak boson set would leave 2, and removing the Higgs would leave 1 boson left to discover for that amount of complexity in the bosonic cyclotomic groups.

The fantastic implications of the 26 group of particles, and the underlying fundamentals which lead to strong complex-rooted pairs and leptonic pair-set separation. Well, that’s another future.

Roll on the Plankon, as good a name as any. The extension of any GUT beyond it would be either some higher bosonic cyclotomy or a higher-order effect of fermions leading to deviation from Heisenberg uncertainty.

Up          Charm       Top
Down        Strange     Bottom
Electron    Muon        Tau
E Neutrino  M Neutrino  T Neutrino
H           Photon      W+
?           Z0          W-
Gluon       Gluon       Gluon
Gluon       Tits Gluon  Gluon
Gluon       Gluon       Gluon

Dimensions of Manifolds

The Lorentz manifold is 7-dimensional, with 3 space-like, 1 time-like and 3 velocity-like dimensions, while the other connected manifold has 2 space-like, 1 time-like, 2 velocity-like and one dimensionless “unitless” dimension. So the 6-dimensional “charge” manifold has the properties of perhaps 2 toroids and 2 closed path lines in a topological product.

Metres to the 4th power per second: the rate of change of a 4D spatial object, perhaps. The Lorentz manifold has a similar metres-to-the-6th-power-per-second-squared measure of dimensional product; or area per kilogram, and area per kilogram squared, respectively. This links well with the idea of an invariant gravitational constant for a dimensionless “force” measure, and a mass “charge” in the non-Lorentz manifold of root kilogram.

Root seconds per metre? Would this be the Uncertain Geometry secondary “quantum mass per length field” and the “relativistic invariant Newtonian mass per length field”? To put it another way, the constant G maps the kg squared per unit area into a force, but the dimensionless quantity (not in units of force) becomes a projector through the dimensionless-to-force map.

GF*GD = G, and only GF is responsible for mapping to units of force with relativistic corrections. GD maps to a dimensionless quantity and hence would be invariant. In the non-Lorentz manifold the GMM/r^2 equivalent would have units of root kilogram (root seconds per metre), and GD would have different units too. Another option is for M to be quantized, with the form GM/r^2, as both the “charge” masses could be the same quantized quantity.

The reason the second way is less consistent: using the product of field energies as the linear projection of force would give an M^2 over an r^2, and it would remove some logical mappings or symmetries. In terms of moment-of-inertia thinking, GMM/Mr^2 springs to mind, but it has little form beyond being an extra idea to test the maths with.

W Baryogenisis Asymmetrical Charge

The split of W plus and minus into separate particle slots takes the idea that the charge-mass asymmetry between electrons and protons can come from a tiny mass half-life asymmetry. Charge cancellation of antiparticle WW pairs may still hold, but momentum cancellation does not have to be exact, leading to a net dielectric momentum. Who knows of an experiment to test this? A slight induced photon-to-Z imbalance on the charge gradient, with a neutrino emission. The W plus to minus mass ratio would then be a consequence of the sporadic group orders and some twist in very taut space versus some not-as-taut space, or a dimensionless expression of a symmetrically broken balance of exacts.

The observation of a dimensionless “unitless” dimension would be invariant to spacetime and mass density dilation. My brain is doing a little parallel axis theorem on the side, and saying 3D conservation of energy is an emerging construction, with torsion being a dilative observable in taut spacetime.

A recent experiment on the inertia of spin in neutrons provides a wave induction mechanism. Amplified remote observation of non-EM radio may be possible. Lenz’s law of counter-EM cancellation may not apply. It is interesting. Mass aperture flux density per bit might be OK depending on the S/N ratio. That reminds me of nV/root Hz. So root seconds is per root Hz, and nV, or scaled volts, is joules per mol charge, Z integer scale */ joules, or just joules; in Uncertain Geometry house units, Hz. So Hz per root Hz, or just root Hz (per mol).

So root seconds per metre is per root Hz metre, as the “kilogram equivalent, but for a kind of hypercharge” in the non-Lorentz manifold, perhaps. The equivalent of GD (HD) projects the invariant to an actual force. By moving the dilative into GF and HF, use can be made of invariant analysis. Mols per root Hz metre is also a possible QH in FHI = HD*QH*QH/R^2, the manifold disconnect being of a radius-calculated norm in nature. A “charge” in per noise energy metre?

Beyond the Particles to the 18n of Space with a Tits Connection

Why I lad, it’s sure been a beginning. The 26 sporadic groups and the Tits group as a connection to the 18n infinite families of simple groups. What is the impedance of free space (Google it), and does water become an increase or decrease on that number of ohms? Inside the nature of the speed of light at a refractive boundary, what shape is the bend of a deflection, and what ohm expectations are there on the impedance to the progress of light?

Boltzmann Statistics in the Conduction of Noise Energy as Dark Energy

Just as ohm metres is a resistivity of the medium, its inverse being a conductivity in the medium, a quantity with units relating to “noise energy or intensity” with an extra metres may be an area-over-length transform of a bulk property of a thing. The idea that a “charge” can be a bulk noise conductivity makes for an interesting question or two. Is entanglement conducted? Can qubits be shielded? Can noise be removed from a volume of space?

If noise pushes on spacetime, is it dark energy? Is the Tits gluon connection an axion with extras, conducting into the spacetime field at a particular cycle size of the double cover of the group singled out from the 18n families, which shall be known as the flux origin: 2F4(2)′, maybe the biggest communication opportunity this side of the supermassive black hole Sagittarius A*.

The Outer Manifold Multiverse Time Advantage Hypothesis

Assuming conductivity, and locations of the dimensionally reduced holographic manifold, plus time relativistic dilation, what is the speed of light to entanglement conduction ratio possibilities?

As noise from entanglement comes from everywhere, any directionality control of the noise implies focus and control of noise amounts from differentially noise-shaped sources. Information is therefore not in the bit state, but in the error spectra of the bits.

The inner (or Lorentz) manifold is inside the horizon, and maybe the holographic principle is in error in that both manifolds project onto each other: what is inside a growing black hole remains inside, and when growth happens, does the outer manifold get pushed completely further out?

A note on dimensionful invariants such as velocities: although they are invariant, they become susceptible to environmental density manipulation, whereas dimensionless invariants are truly invariant in that there is no metre or second that will ever alter the scalar value. For example, Planck’s constant is dimensionless in Uncertain Geometry house units.

So even though the decode may take a while due to the distance of the environmental entanglement and its influence on the statistics (is it a radius or radius-squared effect?), the isolation of a transmission via a vacuum could in principle be detected. Is there a relationship between distance and time of decode, for the relevance of data causality?

If the spectrum of the “noise” is detectable, then it must have properties different from other environmental noise, such as being the answer to a non-binary question, and hopefully degenerative pressure eventually forces the projection of the counter-solutions in the noise, allowing detection by statistical absence.

Of course you could see it as a way for the sender just to know what had not been received, from basic entanglement ideas, and you might be right. The speed of temperature conduction is limited by the speed of light and by non-“cool-packed” atomic orbital occupancy, as the bulk is controlled by photon exchange and not by the degenerative limits imposed by Pauli exclusion. A quantum qubit system not under a cooling vacuum does not produce the right answer, but does it statistically, very slightly, produce it more often? Is the calculation drive of the gating applying a calculative pressure to the system of qubits, such that other qubits under a different calculation pressure can either unite or compete for the honour of the results?

Quantum noise plus thermal noise equals noise? 1/f? Shot noise, for example, is due to carrier conduction in PN-junction semiconductors, in some instances. It could be considered a kind of particle observation by the current in the junction, which gets (or can be) amplified. I’m not sure if it is independent of temperature in a limited (non-plasma-like) range, but it is not thermal noise.

The Lambda Outer Manifold Energy in a More General Relativity

The (inner-of-the-horizon) manifold described by GR has a cosmological constant option associated with it. This could be filled by the “gravitation of quantum noise conduction” of the symmetrical outer manifold’s isomorphic field, with a multiplicative force (dark energy?), such that the total, when viewed in an invariant force measure picture, is not complicated by the horizon singularities of the infinities from division by zero; most notably the Lorentz contraction of the outer manifold as it passes through the horizon on expansion or contraction of the radius.

The radius itself, not being invariant, cannot be cast to other observers to make sense; only calculated invariants (and I’d go as far as to say dimensionless invariants) have the required properties to be shared (or just agreed) between observers without questions of relativistic reshaping. Communication does not have to happen to agree on this knowledge of the entangled dimensionless measure.

CMB Focus History

With the CMB, assume the temperature bends due to density and distance from a pixel; a back-step in time then becomes a new picture with its own fluctuations in density, and hence a bend, summing an action on a pixel from an earlier accumulation over pixels drifting at a bent velocity: motion in the direction of heat, moved further back in time. Does anything good show up? Does the moment weight of things other than an inverse-square bend look a little different?

So as the transparency emission happened over a time interval, the mass should allow a kind of focus back until the opacity happens. That is not as much of a problem as it appears, as it is a fractional absorption ratio, and the transparency balance passes or crosses through zero on an extrapolation of the expectation of continuation.

Then there may be further crossings back, as the down-conversion of the redshift converts ultra gamma into the microwave band and lower. The fact that the IF stage of the CMB receiver has a frequency response curve, and that a redshift function may be defined by a function in variables, might make for an interesting application of an end-point integral, as the swapping of a series in dx (Simpson’s rule) becomes a series in differentials of the function, but with an exponential kind of weighting better suited to series acceleration.
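As a baseline for the series-in-dx side of that swap, composite Simpson's rule looks like this (a standard textbook sketch, not specific to the CMB pipeline described above):

```python
def simpson(f, a, b, n):
    """Composite Simpson's rule: integrate f over [a, b] with n (even)
    subintervals, weighting interior samples 4, 2, 4, 2, ..."""
    if n % 2:
        n += 1  # Simpson's rule needs an even number of subintervals
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3
```

Simpson's rule is exact for cubics, which makes a handy sanity check.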

Looking back via a kind of differential-calculus induction of the function, right back, and back. The size of the observation aperture will greatly assist, as would effective interpolation in the size of the image, with some knowledge of general relativity and the 3D distance of the source of the CMB.

To the Manifold and Beyond

Always fun to end with a few jokes, so the one about messing with your experiment from here in multiple ways, taking one way home and not telling you if I switched it off, seems a good one. There are likely more, but today has much thought in it, and there is quite a lot I can’t do. I can only suggest CERN keep the W+ and W- events in different buckets on the “half-spin anti-matter opposite charge symmetry, full-spin boson anti-matter same charge symmetry as could just be any”, and “I wonder if the aliens in the outer universe drew a god on the outside of the black hole just for giggles.”

Differential Modulation So Far

Consider the mapping x(t+1) = k.x(t).(1-x(t)), made famous in chaos mathematics. Given a suitable set of values of k, one for each symbol to be represented on the stream, preferably values which produce a chaotic sequence, the sequence can be map-stretched to encompass the transmission range of the signal swing.

Knowing that the initial state is represented with exact precision, and that all calculations are performed using deterministic arithmetic with rounding, it becomes obvious that for a given transmit precision it is possible to recover some pre-reception transmission by inferring the preceding chaotic sequence.

The calculation involved for maximum likelihood would be involved and extensive to obtain a “lock”, but after lock the calculation overhead would go down, and just assist in a form of error correction. In terms of noise immunity this would be a reasonable modulation, as the past estimation would become more accurate given reception time and greater knowledge of the sequence, its meaning and its scope of sense in decode.
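A sketch of the modulator side of this scheme (the k values per symbol, the initial state and the steps-per-symbol count are all illustrative assumptions):

```python
def logistic_stream(x0, k, n):
    """Iterate x(t+1) = k*x(t)*(1 - x(t)) for n steps, returning all states."""
    xs = [x0]
    for _ in range(n):
        xs.append(k * xs[-1] * (1.0 - xs[-1]))
    return xs

def modulate(symbols, k_for_symbol, x0=0.5, steps=8):
    """Per symbol, run the map with that symbol's k and emit the samples.
    The receiver would fit k per block to recover the symbol."""
    out, x = [], x0
    for s in symbols:
        k = k_for_symbol[s]
        for _ in range(steps):
            x = k * x * (1.0 - x)
            out.append(x)
    return out
```

For k up to 4 the state stays in [0, 1], which is the range that would then be stretched onto the signal swing.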

Time Series Prediction

Given any time series of historical data, the prediction of the future values in the sequence is a computational task which can increase in complexity depending on the dimensionality of the data. For simple scalar data a predictive model based on differentials and expected continuation is perhaps the easiest. The order to which the series can be analysed depends quite a lot on numerical precision.

The computational complexity can be limited by using the local past to bound the size of the finite difference triangle, with the highest order assumed zero or given a Monte Carlo Gaussian spread. Other predictions based on convolution and correlation could also be considered.

When using a local difference triangle, the outgoing sample that makes way for the new sample in the sliding window can be used for a simple calculation of the error introduced by “forgetting” the information. This could in theory be used to control the window size, or the Monte Carlo variance. It is a measure related to a Markov model of a memory process, with the integration of high differentials multiple times giving more predictive deviation from what will actually happen.
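The difference-triangle predictor can be sketched as follows (assuming, as above, that the highest-order difference is taken as zero):

```python
def predict_next(window, order=None):
    """Predict the next value of a series by extending its finite
    difference triangle, assuming the highest-order difference is zero.
    `order` optionally caps the triangle depth to limit compute."""
    diffs = [list(window)]
    while len(diffs[-1]) > 1 and (order is None or len(diffs) <= order):
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    # Summing the last entry of each row steps the triangle forward once.
    return sum(row[-1] for row in diffs)
```

On a polynomial series such as the squares, the triangle bottoms out at zero and the prediction is exact.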

This is obvious when seen in this light. The time sequence has within it an origin in differential equations, although of extreme complexity. This is why spectral convolution correlation works well: expensive to compute, but it works well. Other methods have a lower compute requirement, and this is why I’m focusing on them these past few days.

A modified Gaussian density approach might be promising: assume an amplitude categorization about a mean, so that the signal density (of the time series, in a DSP sense) can approximate “expected” statistics when mapped from the Gaussian onto the historical amplitude density, given that the differentials have various rates of motion themselves in order to express a density.

The most probable direction holds until the over-probable changes the likely direction or rates again. Ideas form from noticing things. Integration, for example, has a naive accumulation of residual error from how floating point numbers are stored, and higher multiple integrals magnify this effect greatly. It would be better to construct an integral from the local data stream of a time series, and work out the required constant by adding a known integral at a fixed point.

Sacrificing integral precision for the non-accumulation of residual power error is a desirable trade-off in many time series problems. The inspiration for the integral estimator came from this understanding. The next step in DSP, from my creative perspective, is a Gaussian compander to normalize high-passed (or regression-subtracted, normalized) data to match a variance- and mean-stabilized Gaussian amplitude.

Integration as a continued sum of Gaussians would, via the central limit theorem, tend toward a narrower variance, but the offset error and same-sign square error (in double integrals: smaller, but with no average cancellation) lead to things like energy amplification in numerical simulation of energy-conserving systems.
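One standard mitigation for the residual accumulation described above (an editorial suggestion, not a technique the text names) is Kahan compensated summation:

```python
def kahan_sum(values):
    """Compensated summation: carry the rounding residual forward so it
    does not accumulate across a long running sum or integral."""
    total = 0.0
    comp = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - comp
        t = total + y
        comp = (t - total) - y  # recover what the addition rounded away
        total = t
    return total
```

Summing ten copies of 0.1 with a plain loop already shows the drift this removes.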

Today’s signal processing piece was sparseLaplace, for quickly finding, for some sigma and time, the integral going toward infinity. I wonder how the series of integrals goes as a summation of increasing sections of the same time step, and how this can be accelerated as a series approximation to the Laplace integral.

The main issue is that it is calculated from the localized data, good and bad. The accuracy depends on the estimates of the differentials, and so on the number of localized terms. It is a more dimensional “filter”, as it has an extra set of variables for the centre and length of the window of samples as well as sigma. A few steps of time should be all that is required to get a series-summation estimate. Even the error in the time-step approximation to the integral has a pattern, and may be used to make the estimate more accurate.

AI and HashMap Turing Machines

Considering that a remarkable abstract datatype or two is possible, and perhaps closely models the human sequential thought process, I wonder today what applications this will have once a suitable execution model, ISA and microarchitecture have been defined. The properties of controllable locality of storage and motion, along with read and write, branch on stimulus, and other yet-to-be-discovered machine operations, make for a container for a kind of universal Turing machine.
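A toy model of that container (a dict as the tape gives the controllable locality of storage and motion; the rule table and the flip-bits example are purely illustrative):

```python
def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """A Turing machine whose tape is a hash map from position to symbol.
    rules: (state, symbol) -> (new_symbol, move, new_state)."""
    tape = dict(tape)  # copy so the caller's tape is untouched
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, " ")  # unwritten cells read as blank
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return tape, state

# A tiny example program: flip a run of 0s/1s until the first blank.
rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", " "): (" ", 0, "halt"),
}
```

The hash map makes the tape sparse and unbounded in both directions, which is the point of the abstraction.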

Today is a good day for robot consciousness, although I wonder just how applicable the implementation model is for biological life all the universe over. Here’s a free paper on a condensed few months of abstract thought.

Computative Psychoanalysis

It’s not just about IT, but about thrashing through what the mind does, can be made to do, and did; it all leverages information and models simulation growth for matched or greater ability.

Yes, it could all be made in neural nets, but given the tools available, why would you choose to stick with the complexity and lack of density of such a solution? A reasoning accelerator would be cool for my PC. How is this going to come about without much worktop workshop? If it were just the oil market I could affect; and how did it come to pass that I was introduced to the fall of oil, and for what other consequential thought sets, and hence productions, I could change?

One might call it wonder, and design dressed in “accidental” reckless endangerment. What should be a simple, obvious benefit to the world becomes embroiled in competition with the drive for profit, for the control of the “others”, making a non-happening which upsets vested interests.

Who’d have thought it from this little cul-de-sac of a planetary system. Not exactly the galactic mainline. And the winner is not halting for a live mind.

UAE4ALL2 on Android with Amiga Forever

It works better than uae4arm when you do not have much internal memory free, as both the system and work drives can be on the SD card. It does involve making an extra System.hdf in a desktop tool, and performing a copy <from> to <to> all clone after formatting the system disk under a name other than the source’s (e.g. not Workbench), so the copy works.

The Work directory can be copied off the Amiga Forever CD (which you own) and placed in the folder <StorageDevice>/Android/data/atua.anddev.uae4all2/files along with the System.hdf, as the app only allows one of each, and boots from one. It also seems not to allow some combinations, and a bare file system on the Work drive is better than the other way round.

If you get the ROMs from the CD too, and place them in there, you get a purple boot screen; for some reason it needs an app emulation restart to use the disks in my configuration. The mouse is horrible, so a little USB mini keyboard and trackpad combo is essential. You kind of have to have a bit of font imagination until you set the screen mode (which also needs a shutdown and restart).

QtAp Getting Better

So the app is getting better. The “interfaces” for the extensions have been defined for now, and I am just doing the last functions of UTF import to bring it up to the level of building the first view. The command menu has been roughly defined, and changes based on the view.

Qt so far is quite nice to use. I have found, as an experienced C/Java coder, that much of the learning curve is not so much finding the right classes as the assumptions one has to make about garbage collection and the use of delete. In some cases it is obvious with some thought (local variable allocation and automatic destruction after use), while in others less so (using a common QPlainTextDocument in multiple widgets and removing the default ones). The basic assumption is that pointer classes have to be handled manually.

https://github.com/jackokring/qtap/blob/master/classfilter.h is a category filter based on an extensible bloom filter. The .c file is in the same directory.
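The real classfilter is C++, so here is only a minimal Python sketch of the underlying bloom filter idea (the bit-array size, hash count and hash choice are illustrative, and the “extensible” part of the real class is not shown):

```python
import hashlib

class BloomFilter:
    """Minimal bloom filter: k hash positions over an m-bit array.
    Membership tests can give false positives but never false negatives."""
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m // 8)

    def _positions(self, item):
        # Derive k positions by salting a cryptographic hash; any family
        # of independent-ish hashes would do here.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:4], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

Added categories always test positive; unrelated ones only rarely do, at a rate set by m and k.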

N.B. It’s so funny that some “amazing” hackers can bring down this sub-$10 server. Way to go to show off your immense skill. A log line: 142.93.167.254 – – [19/Jan/2020:08:38:01 +0000] “POST /ws/v1/cluster/apps/new-application HTTP/1.1” 403 498 “-” “python-re$ … etc. I’m dead sure no such thing exists on this server. And the /wp-login automated port-80 hammering, for services not offered.

But enough of the bad news. Something along the lines of maximal entropic feature virtualization sounds like something nice (or not). Who knows? What’s involved? Some kind of renormalization on the mapping of k-means, for a spread which morphs the feature landscape to focus or diverge the areas to be interpreted?

QtAp Release v0.1.13

GitHub Pages

It’s not great, but it’s quite a nice experience with undo/redo and Git integration. I even added the translation engine as part of the release, though I have done no actual translations. It’s a better initial app as it includes some features that would consume time to add to the example notepad app.

Also in the background, quite a bit has been done which will be ready as soon as the app develops, such as the interception of the action bar so that right-click can show or hide which sections are visible, and this is saved as part of the restored geometry.

0.1.13 QtAp Releases and Development

Getting the greying-out of menus and action buttons depending on state was a nice challenge for learning the signal/slot methods Qt uses. The tray is automatically generated too, depending on the calls to the addMenu function and the setting of flags to indicate state response routing.

I’m likely to build a JavaScript host in there for the user and add in some extras for it, as this seems the most obvious way of scripting in the browser era. There are also possibilities for building new views in QML, allowing some advanced design work under the hood while maintaining a hybrid approach to code implementation.

I cheated quite a lot by having a dependency on Git, and so on SSH. I’m not sure I even need the socket interface as long as I do some proxy code in JS to move data to and from C/C++.

Moving on to adding features to the interface, and a command menu whose selections are based on the current active view. This could be done with buttons, but actions are better, as they have a better shortcut method and easier automatic accessibility tool interfacing.

The icon set likely needs a bit of a spruce-up and matching with some sensible defaults. Maybe add cancellation of a bash sequence, so that anti-commands can be supplied in a list and run if it makes sense to reverse. Maybe later, later.

EDIT: Overriding a class when it is attached to a GUI is slightly complex. I found the easiest way (not necessarily the most efficient, as it depends on how the autogenerated setupUi function saves memory when not executed). The superclass needs a simple bool stopUi = false, with the extending subclass passing this second parameter as true, and an if(!stopUi) execution guard placed before the superclass ui->setupUi(this) call in the constructor.

This allows QObject(parent) to be replaced by superClass(parent) to inherit all the “interface”. There may be other ways using polymorphism, but none as easy.

More Qt 5

So I have the notepad app working well enough today to think about the next steps in its expansion, beyond some simple idiot-proofing of the git handling. It should be very easy to add more views, and a view handler for abstract views of data processing of the text in the main view.

The stylesheet needs work, but it’s still a matter of following the default themes (at least on Linux), and then that should be fine. The mimetype handling perhaps needs some alteration, as .v files also seem to be text.

Now it’s on to making the view list work, so as to generate alternate computed views from the input notepad. And yes, a blank view is available, but it is accessed when the saved state changes and sets the available calculations.

Qt 5 Little App

I’m starting to write a little Qt application based on the simple editor application example. So far it doesn’t do much, and I’ve just been planning and adding some menus. These include a view menu for looking at things differently and a sync menu which will hook into a git repository when completed.

It’s quite a nice little toolkit. I’m looking into Android NDK development with it; I’ll update as I go. So I’ve got the Android kit working to show a simple demo app on an Android phone. It works, but it does hide the top bar if the app scrolls when the input device opens. It comes back if you touch the now-black area where the menu dropdown is.

Apart from setting up the toolchain, that has to be the simplest Android development code I’ve seen. Some of the features, such as sharing the libraries, do not work with the Qt 5.13 release, so you should avoid trying them. It doesn’t really matter, though, for the development of a single-canvas application or a simple data tool.

Apart from minor bugs, Qt Creator is an excellent tool, and should be on more development systems for open source, and even for closed source, though that would require a paid Qt licence.

And so I’ve decided to extend and simplify the notebook example code to make some kind of editor and processor. It seems like it could be easy, and I’ll factor some of the useful routines off into a shared library. It might be good, or it might be just another also-ran.

For some reason it does not offer 32-bit desktop tools, but I might be lucky with the Intel DPC++ compiler. It could be good with some proxy of the GUI.

After a delay today due to hidden disk errors needing an fsck -b <superblock> option to get the machine to boot (and actually liking Timeshift, with /opt and /tools excluded), I am back up and running on the desktop.

So back on track today with some signals and slots to do some logic with the clipboard icons, such as greying them out when using them would not be sensible. I think the first external form, or maybe an embedded widget, might be for the settings.
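The shape of that enable/disable logic can be sketched outside Qt. This is a minimal hand-rolled signal class of my own (not Qt’s API): the editor emits a signal when the selection changes, and the action object listens and flips its enabled state.

```python
# A tiny signal/slot sketch (my own names, not Qt's API) for greying out
# clipboard actions: the editor emits on selection change, the action listens.
class Signal:
    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)

class Editor:
    def __init__(self):
        self.selection_changed = Signal()
        self._selection = ""

    def select(self, text):
        self._selection = text
        self.selection_changed.emit(bool(text))

class CopyAction:
    def __init__(self, editor):
        self.enabled = False  # greyed out until something is selected
        editor.selection_changed.connect(self.set_enabled)

    def set_enabled(self, has_selection):
        self.enabled = has_selection

editor = Editor()
copy_action = CopyAction(editor)
editor.select("hello")  # selection made: copy becomes available
assert copy_action.enabled
editor.select("")       # selection cleared: copy greys out again
assert not copy_action.enabled
```

In real Qt the same wiring is a connect from the selection-changed signal to the action’s setEnabled slot; the point is that the state logic lives in slots, not scattered through the menu code.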

That reminds me that I must add a menu for it. I suppose it is a base view, but it’s likely better if it is considered a special case. Now to test out some other things, as I think Mint 19.3 is coming for Christmas.

EDIT: Mint 19.3 is here, and working quite nicely. The app now has a style sheet, but I need to pick up “the desktop” one and do a bit of search-and-replace to manufacture one for Qt on the fly. A little bit more code is needed to idiot-proof the basic functionality, and then I have to move on to loading in abstract processed views for all the features.

Node.js Destroys Raspbian Desktop

So the attempt to install Node.js from the repository (version 13) just goes mental and wants to destroy the Raspbian desktop. Very useless. So I’ve demoted to Mint 18.1, and I am in the process of building the system up with a Windows 10 VirtualBox machine and some tools I will be needing.

I think I’ll even have to make a test bed system for software I don’t trust enough. Apart from some long-standing tools which work, I like trying some other ones. I wonder if I can get some development tools to build Windows binaries on Linux? There is the fabulous MinGW (which may be useful for some fast C building without needing all the Visual Studio weight and hence bloat), but what else is available?

So far I’ve put on VSCode (good for a Microsoft initiative), Visual Studio Community Edition (also quite good on Windows in the virtual machine), FPC/Lazarus (a good generic compiler) and IntelliJ (a nice Android/Java tool with some good Flutter and Dart). This should be enough to get on with some of the things I was getting on with before the Windows hypervisor issue. It reminds me of that time VirtualBox had to disable the Microsoft hypervisor, and I think they have been fighting since.

This is no excuse for not testing such a popular product with the Windows core system. I don’t think the use of the disk clean-up wizard to gain the space could have been involved, even with more options selected, not this far down the Windows timeline. The only other options are few, and include a botchy “security services” spyware which has no good intent and isn’t that secret, or the fabulous industrial espionage giggle, with the laugh-a-minute goal of finding out something juicy.

Strange Dreams of 53 “Herbie”

So first and third. And some other images. It seems the subconscious is a strange thing, built for purpose by a design evolved over “billennia”. Seems simple and yet so unconscious. Maybe it’s a real change for good. Not in a missionaries-and-karma-pacifiers first with the financial weapons later, but more real, and worth it.

But maybe it’s related to the 1st boot operating system becoming the third boot one. As I had to use Linux Mint 19.1 to bootstrap an ISO burn of Raspbian x86, and then I’m going for a Windows 10 ISO for use in a VirtualBox machine for those occasional must-haves. Luckily I have had no major issues yet with things like Xilinx design tools, or even VirtualBox.

So after some ping and a little browse to the expected gateway, I don’t need to find how to do an auto-config “proxy” script as it will route as soon as the click happens to the faster internet. Which is good considering the number of downloads that I might have to do to fix up a system by Crimbo.

Let this be a never-let-Windows-be-the-hypervisor lesson. Microsoft has too many issues with ignoring such things as other installed and preferred hypervisors. Then they flood OneDrive with all your documents to get them deleted, or interfere with the creation of new ones, unless you upgrade or have a decent smidge of tech savvy.

The Windows quality experience goes on and on. At least it is possible to get the usb0 device to work, although there is likely some DHCP option I have to enable for a default setup. A large percentage of the bandwidth is updates of Windows and Microsoft Store apps.

Windows November Crapware Update

’Tis the season to extort the public for new PC hardware. And the November Windows preview is in on the act. Prepare for a memory management exception and a boot loop where recovery media will be required. It definitely is not undoing the changes.

Now to decide what to run coming up to Christmas. Maybe Linux has a few less bribable bauble feature additions. You may think this is harsh, but disambiguation of alpha release and preview release is not something to be confused over at such an “experienced” software organisation.

Between the user blame game played under fitness for “operating” system purpose, and the provision of sub-par malware (which even goes PXE boot without asking, for some random jump to code, one presumes), M$ has software crap or IME bull.

X16 Millfork Progress

So the basic plot and tile map arrangements are made. Next up is the “open and parse” BMP file from where the currently emulated virtual SD Card will be. This will then allow me to get on with some simple graphics, a compressed large map format and joystick motion of a sprite around the map.

I’m now used to the syntax enough to go for a generic file open routine, and deal with any PETSCII encoding problems. I’ll have to check to see if I can get a list of the directory today. And yes, it seems like device 1 is the device to use. The fact that the cbm_file library is currently buggy is maybe based on the advice in the programmer’s handbook to use 255 as the auxiliary command, but who knows? It does seem like OPEN <X>, 1, <W(1)/R(0)> is the one, but then the close call crashes because the open failed. The read likely returns a fail code but “works” without a device error.
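As a sketch of the kind of PETSCII fix-up I mean (an assumption about the common pattern, not the exact library behaviour): the usual ASCII-to-PETSCII trick for filenames is simply to swap the upper- and lower-case letter ranges, since PETSCII maps them the other way around from ASCII.

```python
# Hedged sketch: the common ASCII->PETSCII fix-up for filenames swaps the
# upper/lower case letter ranges; real PETSCII has further special cases.
def ascii_to_petscii(s):
    out = []
    for ch in s:
        if 'a' <= ch <= 'z':
            out.append(chr(ord(ch) - 32))  # lower -> upper range
        elif 'A' <= ch <= 'Z':
            out.append(chr(ord(ch) + 32))  # upper -> lower range
        else:
            out.append(ch)                 # digits, punctuation pass through
    return ''.join(out)

assert ascii_to_petscii("Test.bmp") == "tEST.BMP"
```

A generic file open routine would run every filename through something like this before handing it to the KERNAL-level open call.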

Millfork and the emulator are now upgraded, and I must try to see what the fixes are, as some breaking changes have been made. Still waiting on the roll-out of the VSCode plugin update, and I might have to manually install it from source. Still good for helping with the build and test though. It’s fun, although a little frustrating when off-spec features are yet to be done.

Commander X16

A nice Commodore 64 almost compatible with extensions. Better graphics modes with 128kB of video memory, 512kB of paged system memory, 40kB of main memory, a good emulator, 8 channel FM sound and an FPGA graphics solution.

Good 6502 development tools are available and the ROMs are open source. You can buy nice USB keyboards to help support the project. The 8-bit guy does many YouTube videos on classic microcomputer technology. He has good support from the retro community, and if it were not for a design constraint of using as many off-the-shelf older parts as possible, it would be a single-chip thing. But part of it is the hands-on experience, as the emulator on a Raspberry Pi would work.

The physical hardware is still under construction. As an 8-bit system without an accelerator it is suitable for new code or conversions of old. The main goal is to prevent scrap via the off-the-shelf requirements and make an easier machine to code for than the Pi. One person can understand it all, given time.

I’m doing a factor analysis of the hardware docs now to check how I’d use it. I think a scan-limited 512 by 448 graphics mode at 4 bits per pixel is one I could use for games and other software development. I’m sure a nice palette of colour can be made for the index and offset constraints of the graphics.
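A quick back-of-envelope check (my own arithmetic) that such a mode fits in the 128 kB of video memory mentioned above:

```python
# 512 x 448 at 4 bits per pixel: does the bitmap fit in 128 kB of VRAM?
width, height, bpp = 512, 448, 4
bytes_needed = width * height * bpp // 8
kib_needed = bytes_needed // 1024
print(kib_needed)  # 112 KiB, leaving 16 KiB for palette/tile/sprite data
assert bytes_needed <= 128 * 1024
```

So the frame buffer takes 112 of the 128 kB, which is tight but workable for a bitmap-mode game.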

It has 16 hardware sprites and more of the system is becoming finalized. The sound was recently tested and works after sorting out some 8MHz timing issues. Some IO is still in design but it is looking good.

Here’s a community site.

As far as tools to look into Millfork, cc65 and the open repositories such as the VSCode plugin are on my list. It’s all looking quite nice. Getting the directory prepared and a suitable subdirectory for the development of a great library and demo template is going to be handy. It can be kind of a clone of the emulator binary directory as this is where the default loading and saving takes place for the emulator.

This will also keep the emulator relevant for the demo, and updates will be easier, without having to fix things to someone else’s schedule, as could happen if the “latest” emulator were the default. This also postpones “ROM changes” enough to avoid getting sucked into regressions or editing other source and having to wait for pull requests to be merged.

The CTRL+SHIFT+P combination for the VSCode command palette gets access to build commands for Millfork with the right configuration. This only has to be done once per project, and a blank main.mfk makes a blank main.prg, so it is kind of set up for further progress. Putting all the right things in the git repo helps to keep the path statement unmodified. The settings are enough to get started.

Here is where I intend to put my development tools and first developments as they become what they become. The low memory and low speed of the system makes for some interesting challenge of design and implementation approximations that will be full of creative potential. Nice!

Happy coding.

UPDATE 2019-10-30: Going quite well. Millfork works well, and the errors I hit were simple if you know what the compiler is trying to do (mainly obvious to people who have done assembly). It does indicate that context parsing is done after dead-code tree elimination. Next up is likely a bit of colour in the font, and some file loading of a bitmap.

A New Paper on Computation and Application

https://www.amazon.co.uk/Pipeline-Cache-Big-RISC-Computational-ebook/dp/B07XY9RSHH/ref=sr_1_1?keywords=pipeline+cache+big+risc&qid=1568807888&sr=8-1 is a nice paper on some computation issues, and eventually covers some politics and vitamin biochemistry. Not a fan? Still letting your biome let you shout at the bad people not feeding your hunger?

Shovel in the gammon all you want, and load it up with chips as a little survivor from ancient times takes advantage of the modern high carb diet and digs a hole for you.

Mainly Bio this Week

Fascinating stuff. Excellent vitamin B1 experience. I think something in my mind speaks in an abstract way, and the “normal” part is left to decide it as best it can, minus the paranoiac instafill that would be necessary to help immediately. It’s really visual. A fast set of about 4 images per second.

Ok, Thiamine B1 and cod liver oil confirmed ok for intelligent fat people to regain composure.

If you don’t understand the sophisticated dry sense of humour in the use of the word intelligent, then perhaps you need a redose of the catalyst essential to human life that is known as B1. The malty desire and the addiction to white carbs explain a lot. Do you vomit on too much beer these days? If you don’t, you likely need to cut back on the malty flavour, and just add the very specific B1 in large 100mg amounts per day to your system until you reach the point of having enough, and I understand the reaction then is to vomit.

There we go, tiny starving Japanese amounts of firelighter, as we wouldn’t want to extinguish the slim people with a reasonable dose. This vitamin B1 could be considered to have quite a dark side.

It’s so important that an active transport mechanism picks it up fast in the ileum, and alcohol does switch this off to a large extent if consumed at the same time, leading to a high desire for the malty stuff, and some would say over-amounts of some of the more toxic B vitamins. Some would say people shovel carbs to get B1, which frankly is added to staple carbs.

I did find my appetite for carbs drop significantly and my energy increased as ketosis and kinase action became easy again. I would say to be careful if you’ve been hungering for B1 for a while, as all manner of things switch on, and the dose of 100mg is about a week of a Japanese start, and not full-on fatty full replacement.

So on to sulphur catalyst volcanic chemistry. As one of the few ways the body has of getting sulphur in an organic context, vitamin B1 is very special. The other two known ways of obtaining organically relevant sulphur are through the methionine and cysteine amino acids. These are the other major sulphurs. As the body can make cysteine from methionine it is not too critical, but it performs the ancient critical function of structural sulphur-to-sulphur bonding in nails and hair and, and, and …

Methionine is maybe not needed in many proteins in large amounts (although it theoretically could be, but seems to be a newer addition to protein synthesis), but it does form a protein-rate expression regulator because it is the mRNA translation initiator. Thus the amount does control, to a good extent, the speed of protein synthesis. B cell proliferation, for example, for antibody production. The biochemistry of the gut can make both these essentials (B1 and methionine) given sufficient organic sulphur precursors, and an eggy fart smell is some unwanted bacteria making sulphide gas when the precious sulphur is needed. This phase is when the gas emissions start triggering problems for the local Treg cell population, and so the antibody arms are released.

Such sulphur should not be wasted, as wasting it prevents the efficient creation of B1 and methionine to absorb. The egg farts disappear and the sulphur system carries on. You should experience a small apparent decrease in energy at this point, but some of that is due to up-regulation of control over newly freed cerebellum actions.

The transketolase test should perhaps be replaced with a white cell centrifuge and mass spec or NMR (odd? or context spin coupling?) sulphur test.

The possibility that part of the Mediterranean diet benefit is volcanic eruption fallout helping make the sulphur B vitamins is an interesting one, and needs some correlation data at sub-region level, not just whole countries, for better matching. An involved process needing some extended geo data.

So a germ that eats B1? To induce carb consumption, which reduces the salady goodness, which induces other deficiencies and general ill health. Perhaps with a B12 deficiency later, a slim-down and chronic illness.

A large B1 100mg dose gives them something to feed on, and they go flat out eating it beyond their “insulin” (or whatever) threshold, and I suppose they excreted too much hydrogen sulphide, hence that antibody attack and the cell-content “unauthorized sulphur chemistry” room-clearing experience. I have more B1 in me, but yet just the one fart happened, on day five (iirc). You might forget the day but unlikely the experience.

Snoooozzzzzzzze

Sounds like a plan. Seems some playing with the Caustic 3 Android app is also needed. The tiredness which is not tiredness, more of a motor function initialisation deficit. Maybe I’ll get inspired on how to do things today. Maybe some beer. The knotted quipu records of South American Indians and amino acid chains? I think today has a parallels feel to it.

I suppose a visit to the shops would also generate things to go with this chicken. Which might be nice later. Does it have electrolytes? Maybe some more biochemistry videos and the fascinating origins of how tryptophan came to be coded by a stop codon, and speculation on the recruitment of an extra essential amino acid.

Calculus

I don’t always get it wrong.

So it becomes a determined process to integrate. And as the two forms of integration closure are known, the process can be extended, as any integral has a closed form if the series converges. Integration by parts to a series. So why? The end points can have good integral estimates, and many in-between values of the function do not need evaluation. Series acceleration should be enough. Imagine an integral from zero to m^a · n^b which equals m · n. If for some a not equal to b, the factor m or n becomes obvious? The calculation would be logarithmic in the upper limit, in polytime, not linear.

The previous page was:

Think about f+c: its integral is the integral of f plus a rectangle, and the offset by c makes the integrand always positive, giving a defined sign and hence a binary search opportunity.
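The binary search opportunity can be made concrete. This is my own construction of the idea, not code from the original notes: with f offset by c so the integrand is strictly positive, the integral F is monotone in its upper limit, so the limit where F reaches a target value can be bisected for.

```python
# Sketch (my construction) of the f+c idea: with the integrand f(t)+c kept
# strictly positive, F(x) = integral of f+c over [0, x] is monotone in x,
# so the x solving F(x) = target can be found by binary search.
def F(f, c, x, steps=20000):
    # crude left Riemann sum of f(t)+c over [0, x]
    h = x / steps
    return sum((f(i * h) + c) * h for i in range(steps))

def solve_upper_limit(f, c, target, lo=0.0, hi=10.0):
    for _ in range(60):  # bisect on the defined sign of F(mid) - target
        mid = (lo + hi) / 2
        if F(f, c, mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# f(t) = t - 1 dips negative; with c = 2 the integrand t + 1 stays positive.
# Analytically F(x) = x**2/2 + x, so F(x) = 4 exactly at x = 2.
x = solve_upper_limit(lambda t: t - 1, 2.0, 4.0)
assert abs(x - 2.0) < 1e-3
```

The Riemann sum here stands in for whatever series evaluation of the integral is intended; only its monotonicity in the upper limit matters for the search.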

It wasn’t specifically developed to crack public key things; the motivation was simplified solutions to differential equations. Anyone who’s done DE solving knows the problem with them. That problem is integration, and closing it so it is algorithmic is a useful thing. That leaves the Lambert W kind of collection-of-variables problem for real analytical DEs. Good.

It also sets a complexity limit on integration in terms of an analytic function and a series of differential orders. The “try a power series multiplied by ln x” advice is seen as good, but lacking. Hypergeometric series can be seen anew as useful to approach the series of this closure. It may be helpful to decompose these closures into more fundamental sums of new special operators. And do some cancellation. If you find yourself pedantic about dx or plus C, then might I suggest you forget it and blunder on.

N-IDE Java on Android Fire 7

It looks so simple and efficient. I think git is missing, but a simple Total Commander copy into a backed-up directory should be fine for now. It has the basics of Java SE and can even build Android GUI apps. I think I’ll keep things console for now and put together some tools to do things I would like to do.

Seems to run a static main just fine. I wonder how it does with arm system libraries and JNI native calls. I don’t think I’ll use much of that, but it might get useful at some point. The code interface is ok, it’s quite lightweight and so does not fill the storage too much. Quite good for a simple editor with code completion and a simple class creation tool. Should do the job.

I think the most irritation will be the need to insert the method names to then do the top-down coding. Kind of obvious, as you can’t autocomplete an identifier without it being typed in the class anyway. But that’s ok as I’d be defining an expected class “interface” anyhow, and I’m not prone to worry too much about as yet unimplemented methods.

ES-64 Architecture (Open Hardware)

I’ve been looking into a native 64-bit architecture design of late, ES-64, for a future where 64-bit is the default. Booting into 32- and 16-bit modes is history, but frequently done. Trying to flip some design ideas on their head is one thing, but a central build repository is so easy these days. Easier than the VHDL. The aim is eventual code, but at the moment it’s a spreadsheet in PDF format, and an allocation space. Enjoy, if you want to sell your hairdryer to the zero-share landfill or paid recycling dedopter point.

The initial instruction set looks good for general coding, and I decided at the outset to make a large number of opcodes no-operations, giving a certain way to expand to 32- and 48-bit opcodes. It’s inspired by the 68k but has a more RISCy feel. Most addressing modes were sacrificed to allow general operations on Word, Long and Quad as well as Float and Double. Bytes were not considered much, apart from some Unicode helper instructions. The machine is word-addressed.

A large part of the opcode space was opened up by sensible ideas about the stability of certain operations on the PC. So a 20-register machine results, with a lot of free opcode space and a lot of reserved prefixes for things like vectors. A software model for simulation is likely before any VHDL.

The main focus on code density to open up data cache bandwidth means some aspects of RISC have to be ignored. A memory-to-memory model is used instead of a load-store model. This can be more dense for things like one-off data loads, as the load and indirect are done in the same instruction without the extra bits in code. Quick literals are limited to 5 bits and come with a built-in operation. This reduces register requirements, and with general-width operations 64-bit registers can easily split into 2 × 32-bit register halves or 4 × 16-bit register quarters. Most code will fit well, perhaps as 16-bit threaded code with a few virtual memory pages multi-mapped for common subroutines and a springboard for 32- and 64-bit subroutines.
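The register splitting can be sketched with masks and shifts. This is my own illustration of the general idea, not the ES-64 encoding itself: a 64-bit value viewed as two 32-bit halves or four 16-bit quarters, with a lane-wise add that wraps per lane instead of carrying between lanes.

```python
# Sketch (my own encoding, not ES-64's) of general-width register splitting:
# one 64-bit register seen as 2 x 32-bit halves or 4 x 16-bit quarters.
MASK64 = (1 << 64) - 1

def halves(r):
    return [(r >> (32 * i)) & 0xFFFFFFFF for i in range(2)]

def quarters(r):
    return [(r >> (16 * i)) & 0xFFFF for i in range(4)]

def add_quarters(a, b):
    # lane-wise 16-bit add: each lane wraps on its own, no carry between lanes
    out = 0
    for i in range(4):
        lane = (((a >> (16 * i)) + (b >> (16 * i))) & 0xFFFF) << (16 * i)
        out |= lane
    return out & MASK64

r = 0x0123_4567_89AB_CDEF
assert halves(r) == [0x89ABCDEF, 0x01234567]
assert quarters(r) == [0xCDEF, 0x89AB, 0x4567, 0x0123]
# lane 3 wraps (0xFFFF + 1 -> 0x0000) without disturbing the other lanes
assert add_quarters(0xFFFF_0001_0002_0003, 0x0001_0001_0001_0001) \
       == 0x0000_0002_0003_0004
```

The per-lane wraparound is what a general-width ALU has to guarantee in hardware; in software it costs the masking shown above.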

The code generator might be more complex with bucket 64k assignment and routine factorization, but that is a task a machine can do well. There are reasonably efficient methods of code factoring to reduce binary size.

Amiga on Fire on Playstore

The latest thing to try. A Cloanto Amiga Forever OS 3.1 install to SD card on the Amazon Fire 7. Is it the way to get a low-power portable development system? Put an OS on an SD and save main memory? An efficient OS from the times of sub-20 MHz clocks and 50 MB hard drives.

Is it relevant in the PC age? Yes. All the source code in Pascal or C can be shuffled to the PC, and I might even develop some binary prototype apps. Maybe a simple web engine is a good thing to develop, with low CSS bull, and with AROS open development for the x86 architecture becoming better at making for a good VM sandbox experience, with main browsing on a sub-flavour of bloat OS 2020. A browser, a router and an Amiga.

Uae4arm is the emulation app available from the Playstore. I’m looking forward to some Aminet greatness. Some mildly irritated coding in free Pascal with objects these days, and a full GCC build chain. Even a licenced set of games will shrink the Android entertainment bloat. A bargain rush for the technical. Don’t worry you ST users, it’s a chance to dream.

Lazarus lives. Or at least Borglaz the great is as it was. Don’t expect to be developing realtime video code or supercomputer forecasts. I hear there is even a Python. I wonder if there are some other nice things. GCC and a little GUI redo? It’s not about making replacements for Android apps, more a less-bloated but full-featured OS with enough test and utility grunt to make things. I wonder how pas2js is. There is also AMOS 2.0 to turn AMOS source into nice web apps. It’s not as silly as it seems.

Retro minimalism is more power in the hands of code designers. A bit of flange and boilerplate later and it’s a consumer product option with some character.

So it needs about a 100 MB hard disk file, located not on the SD card as it needs write access, and some changes of disk later, a boot of a clean install is done. Add the downloads folder as a disk and alter the mouse speed for the plugged-in OTG keyboard. Excellent. I’ve got more space and speed than I did in the early 90s and 128 MB of Zorro RAM. Still an AGA A1200, but with a 68040 on its fastest setting.

I’ve a plan to install free Pascal and GCC along with some other tools to take the ultra portable Amiga on the move. The night light on the little keyboard will be good for midnight use. Having a media player in the background will be fun and browser downloads should be easy to load.

I’ve installed total commander on the Android side to help with moving files about. The installed BSD socket library would allow running an old Mosaic browser, or AWeb but both are not really suited to any dynamic content. They would be fast though. In practice Chrome and a download mount is more realistic. It’s time to go Aminet fishing.

It turns out that it is possible to put hard files on the SD card, but they must be placed in the Android app data directory and made by the app for correct permissions. So a 512 MB disk was made for better use of larger development versions. This is good for the Pascal 3.1.1 version.

Onwards to install a good editor such as Black’s Editor, and of course LHA and some other goodies such as NewIcons. I’ll delete the LCL alpha units from Pascal as these will not be used by me. I might even get into ARexx or some of the wonderful things on those CD images from Meeting Pearls or a cover disk archive.

Update: For some reason the SD card hard disk image becomes read-locked. The insistent gremlins of the demands of time-value money. So it’s 100 MB and a few libraries short of C. Meanwhile Java N-IDE is churning out class files; PipedInputStream has the buffer to stop PipedOutputStream waffling on and filling up memory. Hecl the language is to be hooked into the CLI I’m throwing together. Then some data time streams and some algorithms. I think the interesting bit today was the idea of stream variables. No strings; the minimum would be a stream.
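The stream-variable idea can be sketched without Java. Here a bounded queue stands in for the PipedInputStream/PipedOutputStream pair (my own stand-in, not the actual CLI code): the writer blocks once the buffer fills, so it can never waffle on and exhaust memory.

```python
# Sketch of a "stream variable": like Java's piped streams, a bounded buffer
# makes the writer block instead of filling up memory ahead of the reader.
import queue
import threading

buf = queue.Queue(maxsize=4)   # writer blocks once 4 items are unread

def writer():
    for ch in "streams, not strings":
        buf.put(ch)            # blocks while the buffer is full
    buf.put(None)              # end-of-stream marker

t = threading.Thread(target=writer)
t.start()

out = []
while (ch := buf.get()) is not None:  # reader drains, unblocking the writer
    out.append(ch)
t.join()
assert "".join(out) == "streams, not strings"
```

The same backpressure is what PipedInputStream’s internal buffer gives for free; the minimum variable type in such a CLI would then be a stream, with strings as a degenerate finite case.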

So after building a CLI and adding in some nice commands, maybe even JOGL as the Android graphics? You know the 32- and 64-bit restrictions (both) on the Play Store though. I wonder if both are pre-built, as much of the regular Android development cycle is filled with crap. Flutter looks good, but for mobile CLI tools with some 80’s bitmap style, it’s just a little too formulaic.