

Rust 2021

First of all, a disclaimer: I’m quite new to the Rust community. Moreover, I’m not even active in it. Not for any particular reason; somehow I usually don’t get involved in any sort of community. I wish I did, but I lack the patience, focus and social skills.

Having said that, I’m learning and using the technology, and I usually read the news and articles written by other people involved in the project. I still consider myself a newbie, so I don’t expect anyone to take my opinion on how Rust should evolve this next year seriously. Not even myself!

The Rust Foundation

This is something I was wishing for, and I’m happy it’s gonna happen soon, but there are some details I consider important.

I’m not the first one to say this: I think the Rust Foundation should be based in the European Union and not in the USA. With all its contradictions (and dark history), Europe still feels like a more open-minded, tolerant, diverse and safer place to be. In Europe there’s also more respect for human rights and citizens’ privacy. I want the Rust community to fully embrace these values.

There’s another reason for me beyond the “values” point. I want the Rust community to be as big and open as possible, and this includes the “enemy” too. The USA seems to have a tradition of imposing bans and economic sanctions on its “enemies”, and we have seen this attitude towards China recently.

I won’t judge whether these measures were reasonable or not, but I honestly fear that they could negatively impact the Rust community as well, fragmenting it, or making it almost impossible for developers living in China to join it (or to stay if they were already in).

Europe is not neutral, but at least it is more neutral than the USA.

Learning materials & certifications

The Rust project has plenty of useful official documentation.

Besides the “books”, there are some basic courses out there, and even some more advanced materials, like the course on embedded programming with Rust by Ferrous Systems.

But I still miss some “general” courses that help developers dive deep into more advanced topics, assess their own knowledge, and ultimately obtain proof of it, in the form of a certification, to improve their hireability.

I see this as a potential source of income for the new Rust Foundation. This is important not just from the financial perspective, but also when we consider the need for some degree of independence from very big corporations.

Tooling & integrations

Rust has plenty of powerful tools to ensure good code quality, but there’s a need for more tooling if we want to spread its usage. Most online CI/CD platforms don’t yet have explicit support for Rust.

We can see this when GitHub is unable to show us the dependency tree of a Rust project, or when it doesn’t warn us about known security problems in our dependencies (while it’s able to do that for other tech stacks).

In the case of CI/CD pipelines, I’ve only seen explicit support for Rust in TravisCI (which is useful in case you want to use complex build matrices).

The same applies to many online static analysis services, whether they focus on code quality or on security. Among dozens of those services, I didn’t see a single one providing explicit support for Rust.

I think we should convince these companies to consider giving explicit support to Rust developers in their products. Here is my first small contribution, showing them that there’s real demand for it.

“Safe” programming

I started learning Rust in part because of its “safety” properties. Rust really shines in this aspect, but I think it could be even better and go beyond “just” memory safety. Anything that reduces the probability of a runtime panic caused by a logic error is worth considering.

One case that comes to my mind is the Range type. As an amateur I noticed some inconveniences, but I lacked the deep understanding needed to dissect them and describe the real problems behind these sensations. And then I found this interesting article on how we could improve it.
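As a small illustration of the kind of friction I mean (my own example, not taken from that article): std::ops::Range is its own Iterator and doesn’t implement Copy, so using it once by value consumes it.

fn main() {
    let r = 0..10; // std::ops::Range<i32>: it is its own Iterator, and not Copy

    // Iterator::sum() takes `self` by value, so this call consumes `r`.
    let total: i32 = r.sum();

    // Uncommenting the next line fails with "use of moved value: `r`".
    // let count = r.count();

    println!("{}", total);
}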

Another thing that interests me is range-checked arithmetic (something along the lines of what Ada provides, with things like bounded integer types). I’m truly hopeful that “const generics” will help to improve this aspect, although that’s only a starting point.
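To illustrate the direction I mean, here is a minimal sketch of my own (all names are hypothetical) of a bounded integer built on const generics. It only validates the bounds at construction time, so it’s still a long way from Ada’s range types.

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct Bounded<const MIN: i32, const MAX: i32>(i32);

impl<const MIN: i32, const MAX: i32> Bounded<MIN, MAX> {
    // Validate the bounds once, at construction time.
    pub fn new(value: i32) -> Option<Self> {
        if (MIN..=MAX).contains(&value) {
            Some(Self(value))
        } else {
            None
        }
    }

    pub fn get(self) -> i32 {
        self.0
    }
}

fn main() {
    let percentage = Bounded::<0, 100>::new(42).expect("42 is within 0..=100");
    println!("{}", percentage.get());
    assert!(Bounded::<0, 100>::new(200).is_none());
}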

“New” ideas for safe programming

I’m not claiming that what I’m gonna propose is new (or even worth pursuing!). The research on programming languages has generated tons of very interesting proposals that usually take years, if not decades, to be introduced into the industry. So I guess someone already had this “same” idea, and probably expressed it in a better way as well.

Before explaining that, let’s check how we would create some types that add extra information about some properties of scalar values.

The “newtype” pattern is probably the best way to start:

// A simple trial-division check, added here so that the snippet is self-contained.
fn is_prime(x: i32) -> bool {
    x > 1 && (2..x).take_while(|d| d * d <= x).all(|d| x % d != 0)
}

struct Prime(i32);

impl Prime {
    pub fn from(x: i32) -> Option<Prime> {
        if is_prime(x) {
            Some(Prime(x))
        } else {
            None
        }
    }
}

With this approach we cannot easily combine the “primality check” with other checks inside the information carried by the type, unless we create a specific type for each combination of properties that we need.
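To make that concrete, this is roughly where the newtype approach leads (the names below are hypothetical): one dedicated wrapper per combination of properties.

// Each individual property needs its own wrapper...
struct Prime(i32);
struct Bounded0To10(i32);

// ...and every combination of properties needs yet another one, with its own
// constructor repeating all the checks.
struct BoundedPrime0To10(i32);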

What I have in mind is some sort of “runtime type tags” that can be “attached” to values without conflicting with each other.

// This attribute would forbid having a parametric enum, and also would force
// the compiler to generate an error if there is no associated "tagger".
#[type_tag(for_scalar_type = i32)]
enum Primality {
    Prime,
    Composite
}

impl Primality {
    // Type-tagger functions should be "const", and consume only immutable
    // references.
    #[type_tagger(for_scalar_type = i32)]
    pub const fn tagger(value: &i32) -> Primality {
        // ...
    }
}

#[type_tag(for_scalar_type = i32)]
enum BoundedNumber<const Min: i32, const Max: i32> {
    Bounded,
    BelowBounds,
    AboveBounds
}

// I assume that a big part of const generics is implemented
impl<const Min: i32, const Max: i32> BoundedNumber<Min, Max> {
    #[type_tagger(for_scalar_type = i32)]
    pub const fn tagger(value: &i32) -> BoundedNumber<Min, Max> {
        if *value < Min {
            BoundedNumber::<Min, Max>::BelowBounds
        } else if *value <= Max {
            BoundedNumber::<Min, Max>::Bounded
        } else {
            BoundedNumber::<Min, Max>::AboveBounds
        }
    }
}
}

// The proposed syntax is just tentative. I'm not sure what would be the best
// way to indicate that a value has been "tagged", but it should support
// multiple tags for a single value.
fn returns_prime_numbers() -> i32 + TypeTagged<Primality::Prime> {
    // ...
}

fn only_accepts_prime_numbers(x: i32 + TypeTagged<Primality::Prime>) {
    // ...
}

fn only_accepts_prime_numbers_between_0_and_10(
    x: i32
       + TypeTagged<Primality::Prime>
       + TypeTagged<BoundedNumber<0, 10>::Bounded>
) {
    // ...
}

fn only_accepts_numbers_outside_0_10_interval(
    x: i32 + TypeTagged<!BoundedNumber<0, 10>::Bounded>
) {
    // ...
}

fn example1() {
    let x = 23;

    // In this case, it's possible to compute the tag at compile time
    // because the tagger is a const fn, and because the value of x is known.
    attach_const_type_tag!(Primality, x);
    attach_const_type_tag!(BoundedNumber<10, 30>, x);

    // This should work as we know the type tag at compile time
    only_accepts_prime_numbers(x);
}

fn example2(x: i32) {
    // We can add multiple tags to an existing value without mutating it.
    // These macro calls will implicitly call the tagger methods, and save the
    // generated type tags in the stack.
    attach_runtime_type_tag!(Primality, x);
    attach_runtime_type_tag!(Parity, x);

    {
        // This runtime "type tag" will be dropped from the stack once we leave
        // this context
        attach_runtime_type_tag!(BoundedNumber<0, 10>, x);
    }

    // This line would generate a compile-time error, since we don't know the
    // associated type tags at compile time.
    only_accepts_prime_numbers(x);

    // This should work. The type tags would be read from the stack and used to
    // choose between branches in this match expression.
    // Notice that macros are not enough to implement this, some support at the
    // language level is required in order to ensure that the type signature of
    // the called function is respected.
    match get_runtime_type_tag!(Primality, x) {
        Primality::Prime => only_accepts_prime_numbers(x),
        Primality::Composite => {}
    }

    // Assuming that 'Squarity' was defined, this should generate a compile-time
    // error, because we didn't add a Squarity type tag to x in this context.
    match get_runtime_type_tag!(Squarity, x) {
        Squarity::Square => println!("{}", sqrt(x)),
        Squarity::NoSquare => {}
    }
}

To be honest, I’m not even sure whether something like this is a good idea. The main point is to make the type system more expressive, allowing us to define constraints on scalar values in a more flexible way.
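For reference, the closest approximation I know of that works today is the classic marker-type / PhantomData trick (all names below are hypothetical). It can label a value with compile-time tags, but combining properties means nesting wrappers, which is exactly the rigidity the idea above tries to avoid.

use std::marker::PhantomData;

// Zero-sized marker types playing the role of "tags".
struct IsPrime;
struct InRange0To10;

// A generic wrapper carrying a value plus one compile-time tag.
struct Tagged<T, Tag>(T, PhantomData<Tag>);

impl<T, Tag> Tagged<T, Tag> {
    fn new(value: T) -> Self {
        Tagged(value, PhantomData)
    }

    fn into_inner(self) -> T {
        self.0
    }
}

// Combining properties means nesting wrappers; a "flat", composable tag set
// like the one sketched above is not expressible this way.
fn only_accepts_bounded_primes(x: Tagged<Tagged<i32, IsPrime>, InRange0To10>) {
    let _ = x.into_inner().into_inner();
}

fn main() {
    // Nothing validates the value here; the tags are purely a labeling device.
    let x = Tagged::<_, InRange0To10>::new(Tagged::<_, IsPrime>::new(7));
    only_accepts_bounded_primes(x);
}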

Conclusions

Of all the things mentioned above, what worries me most is how the Rust community will manage to evolve in an increasingly fragmented world. Economic conflicts are all too common, and new wars would not surprise the vast majority of us.

We have to do our best to create a welcoming, diverse and respectful space that helps people to create new bonds and keep them over time.
