Boolean algebra was extremely popular at the dawn of the digital era, and was part of the syllabus in most electronics courses about thirty to forty years ago.
The basic philosophy behind this was to reduce the number of discrete logic gates needed to design and implement a given set of input and output logic functions. The same principle applied to the even older art of relay logic design.
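As a hypothetical illustration of what that reduction looked like (the expression and the function names below are invented for the example, not taken from any particular design), a sum-of-products expression such as (A AND B) OR (A AND NOT B) would, built literally, need two AND gates, an inverter and an OR gate, yet Boolean algebra collapses it to just A:

    # Minimal sketch, assuming nothing beyond standard Python.
    # f_naive mirrors the expression as written: two ANDs, one
    # inverter and one OR. f_minimised is the same function after
    # Boolean simplification: (A.B) + (A.~B) = A.(B + ~B) = A.
    def f_naive(a, b):
        return (a and b) or (a and not b)

    def f_minimised(a, b):
        return a

    # Exhaustively check that the two forms agree for every input.
    for a in (False, True):
        for b in (False, True):
            assert f_naive(a, b) == f_minimised(a, b)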
These days nobody would try to design a circuit board with several thousand discrete logic gates and hundreds of individual integrated circuits. A single-chip programmable logic array would do exactly the same thing, or, for even more flexibility, so would a single-chip microcontroller.
Forty years ago, all you could buy were logic gates and flip-flops, and that is all you had to work with. So a large part of the economy of a design lay in reducing the total gate and package count. The microprocessor changed all that.
Today nobody really cares any more about reducing gate count. The chip you use may contain several million logic gates, and your implementation may actually use only a few dozen of them. If the chip costs only two dollars, trying to reduce the number of gates used even further is a complete waste of time.
So Boolean algebra has now largely been made obsolete by software. A programmable logic controller would be the modern way to build a logic state machine for industrial control. It would have such a vast internal capacity for logic states that rigorously minimising a design is simply no longer required to fit it all in.
In fact, a big, inefficient, intuitively written logic program may be a lot easier to follow and modify than something that uses a lot of very cunning tricks that others may have great difficulty unravelling later.
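To make that concrete, here is a minimal sketch, in Python, of the kind of plainly written logic state machine the last two paragraphs describe; the tank-filling scenario, state names and thresholds are all invented purely for illustration, and nothing in it is minimised:

    # An intuitively written state machine: every state and transition
    # is spelled out so it can be read and modified easily, with no
    # attempt to reduce the amount of logic used.
    FILLING, HOLDING, DRAINING = "filling", "holding", "draining"

    def next_state(state, level, drain_requested):
        # Hypothetical tank controller; thresholds chosen arbitrarily.
        if state == FILLING and level >= 90:
            return HOLDING
        if state == HOLDING and drain_requested:
            return DRAINING
        if state == DRAINING and level <= 10:
            return FILLING
        return state  # otherwise stay where we are

    # Example run: the tank fills past its threshold, an operator
    # requests a drain, and the cycle starts again once it is empty.
    state = FILLING
    for level, drain in [(50, False), (95, False), (95, True), (5, False)]:
        state = next_state(state, level, drain)
        print(level, drain, state)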