
Reluctance in arithmetic

A reluctant operator is an operator that postpones its operation until later: where a standard operator performs its function immediately, the reluctant version waits until the next operator or operators have performed theirs.
The whole idea here is that we can learn to cope without nested brackets.

For example, in an arithmetical system for Big numbers that resolves expressions from right to left, the operators are right-to-left associative by default, which is fine for the way we think of powers.

Standard power a^b^c = a^(b^c)
Reluctant power a^b♦c = (a^b)^c = a^(b*c) = a^b*c
This fancy reluctant power may just as well be replaced by a multiplication (which is dominant over the power on its left in my system, which reads from right to left). But the next example really needs a reluctant operator.
Reluctant power a+b♦c = (a+b)^c
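To feel the difference, a quick check with small numbers of my own, a=2, b=3, c=2:

Standard    2+3^2 = 2+9 = 11        the power goes first, reading right to left
Reluctant   2+3♦2 = (2+3)^2 = 25    the ♦ waits until the addition is done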

I hope you can feel the problem that lies ahead: more complicated equations cannot be expressed in such a simple system, not even by intelligent rearrangement of the operands.
E.g. the translation of a+((b+c)*(d+e))^f to a format without brackets raises serious issues:
If * is reluctant, and if it is reluctant 'all the way', it will let the whole of (a+b+c) be added up first,
but if the a is shifted to the right, its + operator must be made reluctant too, and the equation becomes:
b+c♥d+e♦f♣a, but then the order of what must happen after the two standard additions + isn't clear anymore.
Either way, a+((b+c)*(d+e))^(f+g) or (a+(b+c)*(d+e))^(f+g) goes wrong in any simple reluctant system I can propose.

Some inspiration comes from Regular Expressions (RegExp), a system for searching text with tiny patterns. In RegExp the property of reluctance further specifies the function of its primitive arithmetical quantifiers.

RegExp quantifiers are greedy by default (eating their way from left to right, trying to match a string of maximum length). Then there are two supplementary forms of special quantifiers: the lazy quantifiers (which try to match a minimum number of characters) and, in Java, possessive quantifiers, which do not let go, even when the overall match would otherwise succeed.
Quantifiers are specified as lazy (also known as non-greedy, reluctant, minimal, ungreedy) by putting a question mark ? after the quantifier.

Lazy or reluctant quantifiers in RegExp are not the same as the reluctant operators I envision for an arithmetical Big number system, because the lazy quantifier does not postpone its function: it does try to perform a match. But it matches as few characters as possible; only when the match as a whole would otherwise fail does it accept one more character, and so on.
E.g. Q+? will capture at least one character Q, or else the Java Matcher reports no match.
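A minimal Java sketch of this behaviour (my own test case; the class name and the input QQQ are just inventions for this page):

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class ReluctantDemo {
        public static void main(String[] args) {
            String input = "QQQ";

            // Greedy + : matches as many Q's as it can.
            Matcher greedy = Pattern.compile("Q+").matcher(input);
            if (greedy.find()) System.out.println(greedy.group());  // prints QQQ

            // Reluctant +? : matches as few Q's as possible, but at least one.
            Matcher lazy = Pattern.compile("Q+?").matcher(input);
            if (lazy.find()) System.out.println(lazy.group());      // prints Q

            // Without a single Q the reluctant version fails too: find() returns false.
            System.out.println(Pattern.compile("Q+?").matcher("xyz").find());  // prints false
        }
    }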

Reluctant quantifiers are constructed with X to specify a character sequence; next comes the quantifier (? * + and the counted forms), and the extra ? is for reluctance (a small Java check follows the list):

X??       X, once or not at all
X*?       X, zero or more times
X+?       X, one or more times
X{n}?     X, exactly n times
X{n,}?    X, at least n times
X{n,m}?   X, at least n but not more than m times
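A small check of the counted forms (again my own example, plain java.util.regex): the reluctant range {2,4}? settles for its minimum of two characters, where the greedy {2,4} takes all four.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class CountedQuantifiers {
        public static void main(String[] args) {
            Matcher greedy = Pattern.compile("a{2,4}").matcher("aaaa");
            Matcher lazy   = Pattern.compile("a{2,4}?").matcher("aaaa");
            if (greedy.find()) System.out.println(greedy.group());  // prints aaaa
            if (lazy.find())   System.out.println(lazy.group());    // prints aa
        }
    }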

It's not easy to think of a use for that fourth option, X{n}?: an exact count of n leaves nothing for the reluctant ? to decide. I give up.

I have a better idea. In an arithmetical context reluctance can be specified by a number. If an operator feels reluctant to take on its operation, then it must specify when it will. For example:

a+((b+c)*(d+e))^(f+g) = a+b+c*1d+e^2f+g
(a+(b+c)*(d+e))^(f+g) = a+b+c*1d+e^3f+g

The numbers written directly behind the operators (think of them as superscripts) specify what I call the reluctance range of the operator to their left. In both examples above the addition b+c isn't the responsibility of ^, because * has declared its reluctance and counts down its 1 once that addition is done.
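To see the counting in action, take some small numbers, say a=1, b=2, c=3, d=4, e=5, f=1, g=1, so that a+((b+c)*(d+e))^(f+g) = 1+(5*9)^2 = 2026. Reading a+b+c*1d+e^2f+g from the right:

f+g = 1+1 = 2      the rightmost addition goes first
^2 postpones       its reluctance range of 2 is still open
d+e = 4+5 = 9      the ^ counts down 2 -> 1
*1 postpones       its reluctance range of 1 is still open
b+c = 2+3 = 5      this addition belongs to *, which counts down 1 -> 0
* fires            5*9 = 45, and the ^ counts down 1 -> 0
^ fires            45^2 = 2025
a+ comes last      1+2025 = 2026

In the second example the ^ has reluctance range 3, so it also waits for the addition of a: the last steps become 1+45 = 46 and 46^2 = 2116.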

The nice thing about having numbers for the reluctance range of an operator is that a higher number also indicates an increased effect of its operation. At least when the equation doesn't have negative numbers and when the operator is relatively big, the result of an operation with a positive reluctance will be larger than that of a standard zero-reluctance right-to-left operation.
Greedy negative reluctances come to mind, which would allow an operator to take precedence over an operator on its right side. Something unheard of. It requires a scan from left to right before attempting any operation. We must talk about negative numbers later.

Composite reluctant operators are a more exotic possibility. They postpone their operation for one part (of a later second operand) and apply their operation to the remaining part.
A few experiments now, where the fraction n/b (with 0<n<b) written behind an operator defines a composite reluctant operator; a numerical check follows the list:

a^b+n/bc = (a^n) + a^((b-n)+c)
a^b*n/bc = (a^n) * a^((b-n)*c) = a^(n+(b-n)*c)  (not simply a^(b*c) = a^b*0c, unless c = 1)
a*b+n/bc = (a*n) + a*((b-n)+c) = a*(b+c) = a*b+0c
a+b*n/bc = (a+n)*c + (b-n)*c = (a+b)*c = a+b*1c
a*b^n/bc = (a*n)^c * (b-n)^c
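A quick numerical check of the third experiment, with a=2, b=3, n=1, c=4: (2*1) + 2*((3-1)+4) = 2 + 12 = 14, which is indeed a*(b+c) = 2*(3+4).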

Fractional reluctant operators were not exactly what I was looking for, but the normal reluctant operators do help computerized arithmetical systems to avoid brackets. Perfectly, so it seems.

to be continued