Indeed, there are some kinks in the output that have to be ironed out. Before I go too far with the divisibility test algorithm I need to look at Treisaran's thread and some resources to see whether I am forgetting something.
Test of the divisibility rule script on base 17. (I don't have all the regular tests written.) Looks pretty good. Might change how some things are written (em-dashes, etc.). (Updated 20151104)
It's been a tough slog, but I have extended the code to generate regular rules. This segment is complicated because we have to generate a range. If that range is too long (I used 18 terms as the maximum) it abbreviates it with ellipses. If the number x involves a power of 2 or 3 it can take advantage of range folding (e.g., decimal 4, 8; dozenal 8, 9). That was a tricky thing to get through and I almost quit. The program now handles all bases up to 36. I will extend it to any base as soon as I fix the digit converter.
I am still looking for glitches... this code took much longer to write than I ever thought it would, and is more complicated than I'd anticipated.
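The range-abbreviation behaviour described above can be sketched in Python (the actual script is in Wolfram; the function name and the handling around the 18-term cutoff are my own illustration):

```python
def regular_rule_endings(x, b, k, max_terms=18):
    """For x regular to base b with x dividing b**k, a number is divisible
    by x exactly when its last k digits (read as a base-b integer) are a
    multiple of x.  Returns the valid endings, abbreviated past max_terms."""
    if (b ** k) % x != 0:
        raise ValueError(f"{x} is not a divisor of {b}^{k}")
    endings = list(range(0, b ** k, x))
    if len(endings) > max_terms:
        return endings[:max_terms], True   # abbreviated with an ellipsis
    return endings, False

# decimal 4 has twenty-five two-digit endings, so the list gets abbreviated
terms, abbreviated = regular_rule_endings(4, 10, 2)
```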

Double sharp (Dozens Demigod)
 Joined: 19 Sep 2015, 11:02
This was meant to be 2 and d, right? Earlier, your script writes "d if the tetradecimal digital root of x is divisible by d", so I'm assuming that all these numbers are supposed to be expressed in whatever base is being examined at the moment, not decimal.

icarus @ Oct 31 2015, 02:08 PM wrote:
P.S. {930} looks gorgeous, thanks to the neighbouring divisors 30 and 31 creating a beautiful orange diagonal in the digit map.

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
Double Sharp,
Yes! There are some validation items that need to happen. I do need to ensure that any figure in the prose is stated in base b EXCEPT parenthetical ones, which are decimal. (There will be a note to this effect).
I am holding off on fixing the script to represent large bases (here meaning bases greater than 36), for no other reason than that I have to append the next sequence of transdecimals beyond the lowercase letters. Do I want to *still* be putting out these sorts of verbal descriptions, or just rely on the digit map? I lean toward YES. If that's so, then do we continue the base-b representation or cut it off? This affects range folding and the ranges stated for regulars. At some point we run out of transdecimals. I think at some point I won't give ranges / range folding but merely state that there are so many combinations. For bases greater than 60, divisor tests become cumbersome; stating ranges is going to suck up a lot of real estate. Going to need to ponder it. What are y'all's thoughts?
Today I worked on the "segment A" again. Pretty satisfied with it so far.
Segment A includes the names, prime decomposition, prime signature, and classes in which base b resides. I think it's nailed. The LaMadrid and SDN names are given, then the prime decomposition and prime signature. The script produces a brief range of numbers with the same prime signature and declares the OEIS number and, if available, classes such as "sphenic numbers", "squarefree semiprimes", etc. I have yet to add "primorials" and other types of number independent of prime signature. Want to add primorials, HCN, SHCN, highly regular numbers, pronic numbers, etc., and will do that. Here are samples:
The prime cases should perhaps more simply read "p is prime; its prime signature is {1}. The number p is term _primePi(p)_ in the sequence of primes {2, 3, 5, ..., p, ...} (OEIS A000040)." This can be handled by an If statement that caters to primes. I think anything else can be treated in a more or less uniform way. [DONE 201511041718]
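The primePi bookkeeping is simple enough to sketch (plain Python trial division here; the real script would presumably use Mathematica's PrimePi):

```python
def prime_pi(n):
    """Count the primes <= n, so a prime p is term prime_pi(p) in the
    sequence {2, 3, 5, ...} (OEIS A000040)."""
    count = 0
    for k in range(2, n + 1):
        # k is prime when no d in 2..sqrt(k) divides it
        if all(k % d for d in range(2, int(k ** 0.5) + 1)):
            count += 1
    return count

# 17 is the 7th prime, so a base-17 page would call p "term 7"
```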
Also, hyperlinks to definitions need to be supplied, as in the tour. There are two separate references, one here and one at the home-to-be. This will be added later. I think the hyperlinks are crucial, so I won't leave them out.
The OEIS hyperlinks can easily be made clickable because the site is so modular. (In fact I will do that now). [DONE 201511041718]
Next week I could lose focus due to business (at this point I would welcome that) but will resume when free time opens up again (feast or famine, self-employed).
I might tackle the generation of auxiliary bases if that is even possible. Will have to define criteria. Resolution of missing or ensuing primes is easy; what to do for bases that don't "need" it? This will favor resolution of the first three primes {2, 3, 5}, with preferred multiplicities {3, 1, 1} but this might require much thought. Again, what are your thoughts?

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
The next segment (B) describes the types of numerals (digits) 0 <= m <= n (with 0 standing for 0 (mod b) rather than actual zero, and "10" not used). The kind and quantity of each numeral type, the sets, and idiosyncrasies (the number of "practical" and "impractical" digits; "alpha", "omega", or "alpha-omega" inheritors; etc.) are described. Primes p | b, and primes q coprime to b, up to b + 1, are described. There are pictorials that can be automatically generated thanks to flexCell. I'll work this out at a meeting (heh) and then maybe also gain ground on this side of the task.
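One segment-B classification can be sketched like so. The grouping below (divisors of b − 1 inherit a digit-sum "omega" test, divisors of b + 1 an alternating-sum "alpha" test) follows the thread's terminology, but the function and its output format are my own reading:

```python
def inheritor_digits(b):
    """Split the candidate test numbers 2..b-1 into omega inheritors
    (divisors of b - 1), alpha inheritors (divisors of b + 1), and
    alpha-omega inheritors (divisors of both)."""
    omega = {m for m in range(2, b) if (b - 1) % m == 0}
    alpha = {m for m in range(2, b) if (b + 1) % m == 0}
    return {
        "omega": sorted(omega - alpha),
        "alpha": sorted(alpha - omega),
        "alpha-omega": sorted(omega & alpha),
    }

# base 17: 16 = 2^4 feeds the omega tests, 18 = 2 * 3^2 the alpha tests
```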

Double sharp (Dozens Demigod)
 Joined: 19 Sep 2015, 11:02
I think it would be mighty interesting to see lists of sexagesimal and centovigesimal divisibility tests. However, you raise a good point about stating ranges. Maybe at some point you'd shift to simply giving the rule without stating which digits are involved? For example, the rule for 2 in pure sexagesimal is that a number is even if it ends in one of {0, 2, 4, 6, 8, a, c, e, g, i, k, m, o, q, s, u, w, y, A, C, E, G, I, K, M, O, Q, S, U, W}. Hmm, quite a mouthful. But what if we just said "if the least significant place value of x is divisible by 2"? That would get the point of the test across without drowning the reader in so many even digits. (Although maybe you do want to illustrate that point where even the divisor tests can reach exhaustion!)

icarus @ Nov 4 2015, 11:02 PM wrote: Double Sharp,
Yes! There are some validation items that need to happen. I do need to ensure that any figure in the prose is stated in base b EXCEPT parenthetical ones, which are decimal. (There will be a note to this effect).
I am holding off on fixing the script to represent large bases (here meaning bases greater than 36), for no other reason than that I have to append the next sequence of transdecimals beyond the lowercase letters. Do I want to *still* be putting out these sorts of verbal descriptions, or just rely on the digit map? I lean toward YES. If that's so, then do we continue the base-b representation or cut it off? This affects range folding and the ranges stated for regulars. At some point we run out of transdecimals. I think at some point I won't give ranges / range folding but merely state that there are so many combinations. For bases greater than 60, divisor tests become cumbersome; stating ranges is going to suck up a lot of real estate. Going to need to ponder it. What are y'all's thoughts?
I think we can safely use a range up to 62 with 0-9, a-z, A-Z, making pure sexagesimal available. My kana suggestion for bases up to 160 alas doesn't work quite as well, as they are not contiguous in Unicode, so maybe we really have to avoid stating ranges for bases like centovigesimal, let alone trecentosexagesimal.
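The 0-9, a-z, A-Z range is easy to mechanise; a minimal Python converter in that digit order (my sketch, not the site's Mathematica code):

```python
import string

# 62 transdecimal digits: 0-9, then a-z, then A-Z
DIGITS = string.digits + string.ascii_lowercase + string.ascii_uppercase

def to_base(n, b):
    """Render the non-negative integer n in base b, 2 <= b <= 62."""
    if not 2 <= b <= len(DIGITS):
        raise ValueError("base must be between 2 and 62")
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, b)
        out.append(DIGITS[r])
    return "".join(reversed(out))

# pure sexagesimal: 59 is the single digit "X"
```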
I think any base with a gapless prime factorisation like {2, 3} has a real problem with auxiliary bases. To get the factor 5, it goes into the vicious cycle I've mentioned several times already:

icarus @ Nov 4 2015, 11:02 PM wrote: Today I worked on the "segment A" again. Pretty satisfied with it so far.
Segment A includes the names, prime decomposition, prime signature, and classes in which base b resides. I think it's nailed. The LaMadrid and SDN names are given, then the prime decomposition and prime signature. The script produces a brief range of numbers with the same prime signature and declares the OEIS number and, if available, classes such as "sphenic numbers", "squarefree semiprimes", etc. I have yet to add "primorials" and other types of number independent of prime signature. Want to add primorials, HCN, SHCN, highly regular numbers, pronic numbers, etc., and will do that. Here are samples:
The prime cases should perhaps more simply read "p is prime; its prime signature is {1}. The number p is term _primePi(p)_ in the sequence of primes {2, 3, 5, ..., p, ...} (OEIS A000040)." This can be handled by an If statement that caters to primes. I think anything else can be treated in a more or less uniform way. [DONE 201511041718]
Also, hyperlinks to definitions need to be supplied, as in the tour. There are two separate references, one here and one at the home-to-be. This will be added later. I think the hyperlinks are crucial, so I won't leave them out.
The OEIS hyperlinks can easily be made clickable because the site is so modular. (In fact I will do that now). [DONE 201511041718]
Next week I could lose focus due to business (At this point I would welcome that) but will resume when free time opens up again (feast or famine self employed).
I might tackle the generation of auxiliary bases if that is even possible. Will have to define criteria. Resolution of missing or ensuing primes is easy; what to do for bases that don't "need" it? This will favor resolution of the first three primes {2, 3, 5}, with preferred multiplicities {3, 1, 1} but this might require much thought. Again, what are your thoughts?
[z]
10 doesn't have 5. Multiply by 5!
50 blunts halves. Multiply by 2!
a0 blunts thirds. Multiply by 3!
260 blunts quarters. Multiply by 2!
500 blunts halves. Multiply by 2!
[/z]
And we're back in the same loop, an order of magnitude up. Whereas in decimal, this would resolve gracefully:
[d]
10 doesn't have 3. Multiply by 3!
30 blunts halves. Multiply by 2!
60 keeps the decimal factor {2} clean (we're torpedoing the not-so-important {5}). First workable decimal auxiliary base: sixty.
[/d]
In fact, this seems like a reasonable algorithm for auxiliary base generation. Take the first prime factor you don't have in your base, and multiply your base by it: this factor will gain prominence. (In decimal's case, 3.) Then reinforce the more important prime factors of the base. (In decimal's case, 2.) Now you will have the most important fractions dealt with cleanly. If you ever get stuck in a loop, as for dozenal (or senary), there isn't a good auxiliary base available. But there are so many factors at play that it might not work 100% accurately. For example, users of a prime-power base like octal or hexadecimal may be quite willing to torpedo high prime powers that are already factors of the base, as for example {3, 5} are more important than {8}.
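Since the stopping rule is exactly the unsettled part, rather than guess at it, here is a small Python helper for the quantity the loop is chasing: how many small unit fractions a candidate auxiliary resolves cleanly (the function name and the limit of 12 are my own choices):

```python
def clean_fractions(aux, limit=12):
    """Denominators d <= limit for which 1/d of the auxiliary unit is whole."""
    return [d for d in range(2, limit + 1) if aux % d == 0]

# decimal's ladder: each rung buys more clean small fractions
for aux in (10, 12, 60, 360):
    print(aux, clean_fractions(aux))
```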

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
I have been interrupted a bit today.
At yesterday's meeting, I worked out some thoughts about auxiliaries.
Generally, an auxiliary base is simply a radix used as an aid to some application within the context of the "civilizational base" or the "base in play". This could really be any number. We use base 7 for weeks merely because there are seven heavenly wandering lights. Divisibility apparently didn't come into play. Wendy uses many of her bases as auxiliaries, perhaps in the context of her use of 120. We use a mixed radix to represent years-months-days, etc. (but it is pretty shabby).
What we are really considering is a "divisional" (akin to "multiplicative") or "flexible" auxiliary base. (There must be a better word...)
We want to be able to divide a cyclical unit by a highly factorable base in order to avoid fractions.
In base 10 we have the flex aux bases {12, 60, 360} (among others, but these are predominant). These cover different ranges and "sacrifice" more of the decimal-native "flexibility" to take advantage of the aux base's.
In the case of 12 we completely abandon 5 and replace it with 6. Thus the natural fractions are neatly resolved {3, 4, 6, 8, 9} and we can tell time from across a room or on our wrists as we assault a beach to stop the moves of a despot. It's a rugged and coarse auxiliary base.
When we want more resolution we use decimal-coded sexagesimal. Here we retain 5 and add the primes {2, 3} to get the following:
Retain the half (30) and pick up clean thirds (20, 40) and sixths (10, 50).
Retain quarters, as blunt as decimal-native second-rank quarters (15, 45) vs. (25, 75).
Sacrifice the fifths (12, 24, 36, 48) and tenths (6, 18, 42, 54).
Apparently this works because we native decimalists aren't really using the larger prime 5 intrinsically, per se, but as a surrogate for the negative powers of 2. So we sacrifice 5 to pick up thirds and sixths and maintain halves and quarters. All in all, we really haven't lost anything, just muddied up the clearer fifths and tenths we would have. These fractions remain integers under base 60. Sevenths are completely unsupported.
We use 360 for even more resolution by adding {2^2, 3^2}. This broadens the clean fractions to include quarters and ninths, fifths and tenths. 90 is seen as "round" under 360 probably by dint of using it as a measure of a circle so often; it is the merger between circles and orthogonality.
In base 14 nothing argues against making the first deal. The dozen could be used just as we would in base 10. We might not use 60. The unit fractions of {2, 3, 4, 5, 6, a, c} end up being represented as {22, 16, 11, c, a, 6, 5}, and that's unacceptably muddy. If we don't mind leaving out 5 and just garnering 3 without completely abandoning 7, maybe we'd use decimal 84 = "60" (heh). Then we'd show {2, 3, 4, 6, 7, c, 10} as {30, 20, 17, 10, c, 7, 6}, rather analogous to decimal-coded sexagesimal. If we want {3, 5} and plan to maintain 7 we need decimal 210 at minimum (poor because of poor quarters), but rather decimal 420, which is on par scale-wise with decimal 360. Then {2, 3, 4, 5, 6, 7, a, c, 10} appear as {110, a0, 77, 60, 50, 44, 30, 27, 22}.
Yes, generally we are filling in to get a factorization something like Prepend[Union[{2, 3, 5, p}], 2], with p being any prime divisor of b, but it depends on whether we want to completely abandon p > 5, whether we want to accept the fact that there is no 5, etc.
The above figure is the output of a diagram that illustrates the primesieving properties of a base's totatives. (This is decimal).
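Auxiliary-base fraction lists like the ones above are easy to machine-check; a Python sketch (digit conventions 0-9, a-z; the function names are mine):

```python
DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base(n, b):
    """Render n in base b with digits 0-9, a-z."""
    s = ""
    while n:
        n, r = divmod(n, b)
        s = DIGITS[r] + s
    return s or "0"

def aux_fractions(aux, b, denoms):
    """Unit fractions of the auxiliary unit aux, rendered in base b;
    denominators are given as base-b digit strings."""
    return {d: to_base(aux // int(d, b), b) for d in denoms}

# decimal 420 ("220" in base 14) as a tetradecimal auxiliary
table = aux_fractions(420, 14, ["2", "3", "4", "5", "6", "7", "a", "c", "10"])
```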

Piotr
Remember to post base 54 in "Senary and {2, 3} bases", as 54 is 2 × 3³.

Double sharp @ Oct 17 2015, 11:52 AM wrote: My intent would be to cover {40, 54, 56} if nothing else. Following that, I'd go to {44, 45, 50, 52}, and then the binary powers {32, 64}.

Double sharp (Dozens Demigod)
 Joined: 19 Sep 2015, 11:02
Huh. So I see in the hidden comments to this thread the following planned tours: {2, 3, 4, 22, 26, 27, 32, 35, 45, 54, 420, 840, 1000, 1260, 1680, 1728, 2310, 2520, 5040}.

icarus @ May 1 2012, 03:04 PM wrote:
Here are my overambitious considered additions: {50} as a larger version of 18, {64} to extend the binary powers, {71} as queen of the leeches, {105} as an odd sphenic number, and {168, 180, 192, 216, 252, 300, 336} to lead to the desolate landscape at 360! I'd also include {160}, another merge of decimal and hexadecimal, and {480, 960} as covered in one of Treisaran's threads (useful as auxiliaries to {8, 16}). Finally, I'd also love {945} as odd and abundant.
The coverage that would result: {2-22, 24-28, 30, 32, 34-36, 40, 42, 45, 48, 50, 54-56, 60, 64, 70-72, 80, 84, 90, 96, 99, 100, 105, 108, 109, 112, 120, 144, 160, 168, 173, 180, 192, 210, 216, 240, 252, 300, 336, 360, 420, 480, 720, 840, 945, 960, 1000, 1260, 1680, 1728, 2310, 2520, 5040}.
The ones remaining to be written:
{2-4, 22, 26, 27, 32, 35, 45, 54, 64, 71, 105, 160, 168, 180, 192, 216, 252, 300, 336, 420, 840, 945, 960, 1000, 1260, 1680, 1728, 2310, 2520, 5040}. Of these, I think many of the large ones have to be automated, so I'll concentrate on the lower range.
Duly noted, Piotr!

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
When the numberbases site begins, I will likely start with "human scale" bases and flesh out the site from there. I'd started long ago on a glossary and will revisit it but won't let that undertaking sap the effort. There is a makeshift glossary that I will expand from. The first part of development will be a couple dozen base pages in the format: numberbases.com/tour/base12.html.
If you have early suggestions I'd appreciate them.
The first bases will be: {6, 8, 10, 12, 14, 16}, {20, 24, 30, 36, 60, 120}, {210, 2310}, {5, 7, 11}, {21, 34, 55, 56, 64, 76, 99}.
Most of them will just be charts at first.

Oschkar (Dozens Disciple)
 Joined: 19 Nov 2011, 01:07
I'm going to suggest 42 and 84 together with the initial mid-scale bases, and 9 and 15 as the human-scale composite odd bases.

icarus @ Jan 18 2016, 11:35 PM wrote: If you have early suggestions I'd appreciate them.
The first bases will be: {6, 8, 10, 12, 14, 16}, {20, 24, 30, 36, 60, 120}, {210, 2310}, {5, 7, 11}, {21, 34, 55, 56, 64, 76, 99}.

I guess the next stage could be the small bases {2, 3, 4}, the superhuman-scale {13, 18, 22, 28}, the remaining mid-scale 3-smooth {48, 72, 96, 108}, the two-dimensional {40, 56, 80, 112}, and the three-dimensional {70, 90, 126, 168, 180}.
126 sounds a lot like a mirror-universe version of 80, being a small smooth number right next to a power of the prime it's missing. The problem is that 126 has only a single binary power, so testing for 4 is awful...
By the way, it's Lamadrid, without the CamelCase.
Also, I love the new alpha2/omega2 colours, they make the divisibility maps that much more varied. (80 and 85 in base 55 are beautiful...)

Double sharp (Dozens Demigod)
 Joined: 19 Sep 2015, 11:02
Please don't forget {240, 360} and the huge {2520, 5040}! I also think 54 deserves a place in there.

Oschkar (Dozens Disciple)
 Joined: 19 Nov 2011, 01:07
Going to add {1680} to the huge bases. I'm fairly sure that I can work in 12:10:14 a bit more easily than in 12:15:14, with all three sub-bases being even, but we lose the nice omega relationship with 11, which then becomes a maximal prime.

Double sharp @ Jan 19 2016, 08:13 AM wrote: Please don't forget {240, 360} and the huge {2520, 5040}! I also think 54 deserves a place in there.
The four-place ninths (0.134'8) don't bother me, really, but the seven-place 27ths (0.046'269'4) do a little, and the only way to resolve it is through base 5040, whose 18:20:14 encoding seems like overkill even for tri-staff bases, and whose best 4-staff encoding is probably the relatively unbalanced 12:10:7:6.
However, I'm reasonably sure that if people were to actually use 12:10:14 as a finer replacement for percentages or something, rounding to three places would be enough to distinguish all relatively common fractions.

Oschkar (Dozens Disciple)
 Joined: 19 Nov 2011, 01:07
In the full map, we also need colours for compound square-alpha tests, like pentaquinquagesimal 51 (α²×ω), 68 (α²×α) and 272 (α²×ω²). I'm going to suggest for these a medium blue-violet, about #865edb, and for things like pentaquinquagesimal 255 and 340, where they combine with the divisors, a lavender like #da89fa for richness 1 and #e4a7fc for higher richness.

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
Oschkar,
I'll look at that. I am running up against a browser sort of limit in terms of the number of colors we can display in tables in web pages. Of course, in Mathematica we can have any number of colors. So long as there is a mathematical manner of defining the category, I can show it.
One of the problems we might run into is, where do we draw the line in saying a test is "intuitive" or not? Now I think I have the ability to "shut off" items that are more obscure using bits. I would enable such tests using the fifth bit. If I "run out of bits" I can "borrow" the seventh (that which enables the regular figure categorization) since it nullifies the effect of most of the other bits. I could use a Boolean method to enable specialty tests to show.
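The bit-flag idea might look like this in miniature. The flag names and most positions here are hypothetical; the post only pins down the fifth bit (specialty tests) and the seventh bit (regular-figure categorization, which nullifies the others) by role:

```python
# One bit per family of tests; OR them into a settings integer.
# SPECIALTY lands on the fifth bit (1 << 4), as in the post.
DIVISOR, OMEGA, ALPHA, COMPOUND, SPECIALTY = (1 << n for n in range(5))
REGULAR_FIGURES = 1 << 6   # the "seventh bit": overrides most of the others

def enabled(settings, flag):
    """True when the family's bit is set, unless the override bit trumps it."""
    if settings & REGULAR_FIGURES and flag != REGULAR_FIGURES:
        return False   # regular-figure mode nullifies the other bits
    return bool(settings & flag)

settings = DIVISOR | OMEGA | SPECIALTY
```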
Another problem is the human eye's ability to distinguish color. The colors you've chosen happen to be in a space we haven't made much use of yet.
On another note, I just discovered a precategorizing method in which I can use MemberQ rather than factoring each number; I only have to factor once per base (!) (so far, not totally fleshed out) and can derive all the flavors of the chart. This will reduce the generation time enormously (currently it puts out base 2520 in about a half second). It will also simplify the code. I am surprised and embarrassed that I hadn't done this sort of step sooner.
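The precategorizing idea, as I read it, in Python terms (set membership standing in for MemberQ; this is a sketch, not the actual Wolfram code):

```python
from math import gcd

def precategorize(b, limit):
    """Build the membership sets for base b once; classifying any n <= limit
    is then a lookup, with no further factoring."""
    divisors = {d for d in range(1, b + 1) if b % d == 0}
    regulars = set()
    for n in range(1, limit + 1):
        m = n
        # strip out every prime m shares with b; regular iff nothing remains
        while m > 1 and (g := gcd(m, b)) > 1:
            while m % g == 0:
                m //= g
        if m == 1:
            regulars.add(n)
    totatives = {n for n in range(1, limit + 1) if gcd(n, b) == 1}
    return divisors, regulars, totatives

def classify(n, sets):
    divisors, regulars, totatives = sets
    if n in divisors:
        return "divisor"
    if n in regulars:
        return "regular"
    if n in totatives:
        return "totative"
    return "semicoprime"
```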

Oschkar (Dozens Disciple)
 Joined: 19 Nov 2011, 01:07
You're probably right that compound square-alpha tests are probably not something that could be reasonably applied in most bases, especially ones like 55, but they might turn out to be intuitive in some small bases, say, testing for 15 in octal, where 5 comes from the square-alpha and 3 from the alpha itself. I don't think that the three types of compound square-alpha tests need to be distinguished in the charts, though.

icarus @ Jan 20 2016, 12:08 AM wrote: One of the problems we might run into is, where do we draw the line in saying a test is "intuitive" or not? Now I think I have the ability to "shut off" items that are more obscure using bits. I would enable such tests using the fifth bit. If I "run out of bits" I can "borrow" the seventh (that which enables the regular figure categorization) since it nullifies the effect of most of the other bits. I could use a Boolean method to enable specialty tests to show.
I'm not sure if you want to reserve the purple range for things like 1 and 2 only, or if other relationships can fit in there somehow, but if they must be in a range other than purple, we can put them in the browns, distinguished from the regular figures by their saturation.

icarus (Dozens Demigod)
 Joined: 11 Apr 2006, 12:29
Now I am not sure we'll get better performance. There isn't much factoring, and maybe looking up values in tables of hundreds or thousands of values is less efficient. Many of the tests are GCDs. But I will test it out. The code is over 100; lines in Wolfram. There are dual routines for HTML and Wolfram, and maybe they might be combinable into a single routine with conversion to the other.

wendy.krieger (Dozens Demigod)
 Joined: 11 Jul 2012, 09:19
Omega2 (which is, I suppose, checking for the likes of 49 in base 120) is not really that useful, because it supposes that you have to divide through by alpha and then apply the same test again, e.g.
2.54 => 2 + 54 = 56, and 2.54 divided by 7 gives 42.
But it gets a little strained when one deals with very large numbers.
Note, e.g., that Alpha2 and Omega2 don't include the powers included in the Alpha and Omega, so in base 50, the test for 49 is the same as for 7: simply add the places.
It must be kept in mind that, while 11 divides 120's Alpha, there is a test at digit level for 11 (signs + - - + over the staff-pairs), e.g. 4.48: (0 + 8) - (4 + 4) = 0. It's kind of like an alpha in decimal and an omega in dozenal.
Generally, Omega tests are very useful for parallel calculation checks, and where omega has divisors, or is small enough to deal with directly, this can be blazingly fast. (I often do mod-7 checks on twelfty calculations.)
For general primes, one can make use of numbers adjacent to multiples of the base; so for 13 and 37 together, note 4.01 is 13*37. When one does a cast-out division, one does not have to recall the full carry, e.g.
9.85.10.16: 9.85 - 8.02 gives 1.83,10; 1.83,10 - 1.80,50 gives 3.10 - 50 = 2.80. Then 2.80.16 less 2.80.80 gives 16 - 80 = -64, and this is the common remainder modulo 13 and 37:
65 - 64 = 1 modulo 13, and 74 - 64 gives 10 modulo 37.
The larger richness (or ripple) regulars present a problem too. One notes that the success or failure depends on the codivisors of the relevant primes. For example, 120 = 8*15, and one can treat the last two places of twelfty as a pair of base-15 digits, e.g. 10.16 = 10.2 (b15; divide 16 by 8), and 10.2 is a multiple of 8, thus this number is a multiple of 64.
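Wendy's mod-7 check on twelfty works because 119 = 7 × 17 is twelfty's omega, so summing base-120 digits preserves the residue mod 7 (and mod 17). A Python sketch with hypothetical names:

```python
def base_digits(n, b=120):
    """Base-b digits of n, least significant first."""
    ds = []
    while n:
        n, r = divmod(n, b)
        ds.append(r)
    return ds or [0]

def omega_residue(n, m, b=120):
    """Residue of n mod m by casting out (b-1)s: valid whenever m | b - 1,
    since then b = 1 (mod m) and digit sums preserve n mod m."""
    assert (b - 1) % m == 0
    while n >= b:
        n = sum(base_digits(n, b))
    return n % m

# 2.54 twelfty = 294; 2 + 54 = 56 = 8 * 7, so 294 is divisible by 7
```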
Twelfty is 120 dec, as 12 decades. V is teen, the '10' digit, E is elef, the '11' digit. A place is occupied by two staves (digits).
Digits group into 2's and 4's, and . , are comma points, : is the radix.
Numbers written with a single point, in twelfty, like 5.3, mean 5 dozen and 3. It is common to push 63 into 5.3 and viki verka.
Exponents (in dec): E = 10^x, Dx=12^x, H=120^x, regardless of base the numbers are in.

icarusDozens Demigod
 Joined: 11 Apr 2006, 12:29
Indeed omega2 is not very useful, but was a byproduct of panning for semicoprimes (products of a regular g and coprime t) that have a coprime part t that is a factor of (b^4 - 1). Now that I might presort the numbers n in the range under study (often that of b itself), the need to handle omega2 is gone. I could color omega2 "hh" or "opaque semicoprime". This would free up a "slot" in HTML.
I am going to rebuild the function with presorting in Associations and MemberQ and generating Wolfram only, mapping an HTML encode function across the InputForm of the Wolfram output to get HTML. The latter should prove to be a mass ReplaceAll routine for the most part. The function will be association rather than list based.
I have a way around the browser CSS ceiling.
Q: in an expanded panning for arithmetic relationships, which should we include? Should I retain omega2 and add alpha2^2omega etc. as Oschkar suggests?
Now we recognize richness levels in the sixth bit. I didn't program "practicality" because it is more complicated than richness, but I have a good guideline. Using the 25 values present in the range-folded regular test for decimal 8, I look for the number g nearest 30 that is regular to b. If a number, even a divisor, has more than g values in the table, it is impractical. If we can range fold to get the value at or below g, then it is semipractical. This renders the evenness test impractical in bases greater than 60, unless it is perceived as 6-on-10. The complexity of practicality is the reason why I just used the guidelines.
The richness guidelines are as follows:
Richness 0: the unit, 1 in any base, as it is the empty product and is a coprime divisor. Purple.
Richness 1: divisors, red.
Richness 2: nondivisor regular with fractions terminating in 2 digits, div tests with 2 digits, orange.
Richness 3: nondivisor regular with fractions terminating in 3 digits, div tests with 3 digits, peach.
Richness > 3: light peach.
The threshold for semicoprimes is richness=2 for their regular part.
The settings could be changed dynamically in the new function.
First I need to test efficiency.

OschkarDozens Disciple
 Joined: 19 Nov 2011, 01:07
{a}icarus @ Jan 20 2016, 12:47 PM wrote: Indeed omega2 is not very useful but was a byproduct of panning for semicoprimes (products of a regular g and coprime t) that have a coprime part t that is a factor of (b^4 - 1). Now that I might presort the numbers n in the range under study (often that of b itself) the need to handle omega2 is gone. I could color omega2 "hh" or "opaque semicoprime". This would free up a "slot" in HTML.
However, omega2 is functionally identical to alpha-omega composites in even bases, and adds just one more binary power in odd bases. I'm sure that we will still need a colour to represent alpha-omega composites, which wouldn't really free up a slot anyway, because things like 21 in octal and 15 in undecimal still need to be represented somehow, and omega2 is practically the same test.
Alpha2, on the other hand, is a totally new test that is only practical for a few bases: probably {7, 8, 12, 13, 17, 18} to test for 5, {5, 8, 18, 21} to test for 13, and {13, 21, 30} to test for 17, assuming a very generous memory limit of 60 instead of your 30. Because the alpha2 test isn't at all obvious, I assume that, except for users of bases {7, 8, 12, 13} testing for 5, its users will be closer to the field of advanced mnemonists and mental calculators.
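These base lists can be checked mechanically: a base b admits an alpha2 test for d exactly when d divides b^2 + 1. A quick Python sketch (the function name `alpha2_bases` is mine):

```python
# A base b has an "alpha2" test for d exactly when d | b^2 + 1.
# List the bases up to a limit that admit such a test for d.

def alpha2_bases(d, limit=36):
    return [b for b in range(2, limit + 1) if (b * b + 1) % d == 0]

print(alpha2_bases(5))    # [2, 3, 7, 8, 12, 13, 17, 18, 22, 23, 27, 28, 32, 33]
print(alpha2_bases(13))   # [5, 8, 18, 21, 31, 34]
print(alpha2_bases(17))   # [4, 13, 21, 30]
```

The lists quoted in the post are the subsets of these that fall under the stated memory limit.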

I have some thoughts on range folding, somewhat in response to Wendyâ€™s comment about testing for 64 in base 120. Icarus, if this is too off topic, can you split the following over to another thread?
I use this decimal test for 16 frequently: A number is divisible by 16 if the sum of the hundreds places and a quarter of the units places is divisible by 4.
a*100 + b = a*4 + b (mod 16)
Since b will be a multiple of 4, we can divide everything by 4, to get a + b/4 (mod 4).
16777216: 72 + 16/4 = 72 + 4 = 76, which is divisible by 4
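The test for 16 as stated is easy to check numerically; a minimal Python sketch (the name `div16` is mine):

```python
# Oschkar's decimal test for 16: with n = a*100 + b (b the last two
# digits), n is divisible by 16 iff 4 | b and 4 | (a + b//4).

def div16(n):
    a, b = divmod(n, 100)
    return b % 4 == 0 and (a + b // 4) % 4 == 0

print(div16(16777216))  # True: 167772 + 16//4 = 167776, divisible by 4
```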
My test for 32 is something like: a number is divisible by 32 if the sum of the hundreds places and a quarter of the units places, plus 4 if the myriads digit is odd, is divisible by 8. I don't really use it, though; I usually just halve the number and then test for 16.
a*10000 + b*100 + c = a*16 + b*4 + c (mod 32)
We divide everything by 4 again, to get a*4 + b + c/4 (mod 8)
a*4 (mod 8) will be 0 if a is even, and 4 if a is odd.
16777216: 72 + 16/4 + 4 = 72 + 4 + 4 = 80, which is divisible by 8
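The 32 rule can be checked the same way; a minimal Python sketch (the name `div32` is mine):

```python
# The corresponding test for 32: with n = a*10000 + b*100 + c,
# n is divisible by 32 iff 4 | c and
# 8 | ((4 if a is odd else 0) + b + c//4).

def div32(n):
    a, rest = divmod(n, 10000)
    b, c = divmod(rest, 100)
    return c % 4 == 0 and ((a % 2) * 4 + b + c // 4) % 8 == 0

print(div32(16777216))  # True: 4 + 72 + 4 = 80, divisible by 8
```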
This test doesnâ€™t extend easily to 64, though:
a*10000 + b*100 + c = a*16 + b*36 + c (mod 64)
Dividing by 4, we get a*4 + b*9 + c/4 (mod 16)
I could try out something like "4 times the myriads twistaff, plus 9 times the hundreds twistaff, plus a quarter of the units places", but that seems more complicated than even I can bear, and I actually use the cube-neighbour tests for decimal 7, 13 and 27... I'll stick with halving twice and then testing for 16.
{6}
Let me see how senary might handle large binary powers. In senary, you can actually memorise all 43 three-digit multiples of 12, so we might range-fold on 12 instead of 4. The problem is that 1000 isn't within 12 steps of a multiple of a large power of 2.
Testing for 24 is simple enough, just the standard rangefold test for one more binary power.
The test for 52 is similar to the decimal test for 24, but subtractive instead of additive:
a*1000 + b = -a*12 + b (mod 52)
Dividing by 12, we get b/12 - a (mod 4)
A number is divisible by 52 if the difference between an eighth of its last three digits and the two preceding digits is divisible by 4.
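The rule for senary 52 checks out numerically; a Python sketch working with the decimal values of the senary pieces (the function name is mine, and 52 senary = 32 decimal, 12 senary = 8, 4 senary = 4):

```python
# Senary test for 52 (decimal 32): take the last three senary places
# as c and the two preceding places as a; the number is divisible by
# 32 iff 8 | c and 4 | (c//8 - a).  Higher places drop out because
# 6^5 = 7776 = 243*32.

def senary_div32(n):
    c = n % 216          # value of the last three senary places
    a = (n // 216) % 36  # value of the two preceding places
    return c % 8 == 0 and (c // 8 - a) % 4 == 0

# exhaustive check against plain divisibility
print(all(senary_div32(n) == (n % 32 == 0) for n in range(100000)))  # True
```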
The test for 144 (d64) is weird, because the residues run out just like in decimal, but it might just be workable, because the residue of the "thousands" is 3, almost trivial to multiply by in senary.
a*100000 + b*1000 + c = a*52 + b*40 + c (mod 144)
Dividing by 12, we get a*4 + b*3 + c/12 (mod 12)
a*4 (mod 12) is 0 if a is even and 4 if a is odd.
A number is divisible by 144 if the sum of an eighth of its last three digits and three times the two preceding digits, plus 4 if the sixth-to-last digit is odd, is divisible by 12.
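The senary 144 rule likewise verifies; a Python sketch (name mine; 144 senary = 64 decimal, and 6^6 = 729*64, so six senary places suffice):

```python
# Senary test for 144 (decimal 64): with a the sixth-to-last senary
# digit, b the two digits before the last three, and c the last
# three places, the number is divisible by 64 iff 8 | c and
# 8 | ((4 if a is odd else 0) + 3*b + c//8).

def senary_div64(n):
    c = n % 216           # value of the last three senary places
    b = (n // 216) % 36   # value of the next two places
    a = (n // 7776) % 6   # the sixth-to-last senary digit
    return c % 8 == 0 and ((a % 2) * 4 + 3 * b + c // 8) % 8 == 0

# exhaustive check against plain divisibility
print(all(senary_div64(n) == (n % 64 == 0) for n in range(100000)))  # True
```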
{a}
It is true, however, that 120 offers a very good range of 7 binary powers, 2 ternary powers (maybe 3, if the hundreds residue of either 4 or 5 doesn't turn out to be that bad) and 3 quinary powers.