HooVorNet: The Hints of Ozone Vortal Network

I’ve decided to call my segmentation network HooVorNet, and I’ll share some of its key features. HooVorNet employs feature generator blocks that split filters among several computation paths and then recombine them by concatenation. HooVorNet’s feature generator blocks are distinct from other split-and-merge feature concentrators such as Google’s Inception module because HooVorNet uses shallow and deep auto-encoders in the split paths and incorporates a skip connection by concatenation.

Using the HooVorNet Feature Generator block, the network simultaneously gains the benefits of standard convolutional feature extraction and the very deep inference of chained auto-encoders. The block encodes shallow swept local features (1×1, 3×3, and 5×5 kernels) and deep strided features (16×16 kernels).
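Building on the `PathAutoEncoder` sketched above, here is a hedged sketch of how such a block might wire the paths together: three shallow auto-encoder paths (1×1, 3×3, 5×5), one deep 16×16 strided path, and the block input concatenated back in as the skip connection. The even four-way channel split, the bottleneck widths, and the transposed-conv decoder in the deep path are assumptions.

```python
class DeepStridedAutoEncoder(nn.Module):
    """Deep path: a 16x16 strided encoder with a transposed-conv decoder."""

    def __init__(self, channels: int, bottleneck: int):
        super().__init__()
        # Input height/width must be multiples of 16 for the decoder
        # to restore the original spatial size exactly.
        self.encode = nn.Sequential(
            nn.Conv2d(channels, bottleneck, kernel_size=16, stride=16),
            nn.ReLU(inplace=True),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(bottleneck, channels, kernel_size=16, stride=16),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decode(self.encode(x))


class FeatureGeneratorBlock(nn.Module):
    """Split input channels across four paths and recombine by concatenation."""

    def __init__(self, channels: int):
        super().__init__()
        assert channels % 4 == 0, "channels are split evenly across four paths"
        c = channels // 4
        self.split = c
        # Three shallow auto-encoder paths: 1x1, 3x3, and 5x5 kernels.
        self.shallow = nn.ModuleList(
            PathAutoEncoder(c, k, bottleneck=max(c // 2, 1)) for k in (1, 3, 5)
        )
        # One deep strided auto-encoder path: 16x16 kernels.
        self.deep = DeepStridedAutoEncoder(c, bottleneck=c * 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        parts = torch.split(x, self.split, dim=1)
        outs = [path(p) for path, p in zip(self.shallow, parts[:3])]
        outs.append(self.deep(parts[3]))
        # Skip connection by concatenation: the block input is appended to the
        # path outputs, so the output has twice the input channel count.
        return torch.cat(outs + [x], dim=1)
```

As a quick sanity check, `FeatureGeneratorBlock(64)` applied to a `(1, 64, 256, 256)` tensor returns a `(1, 128, 256, 256)` tensor: 16 channels from each of the four paths plus the 64-channel skip.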

Just ahead of the output, the network employs special (and undisclosed) deep feature generator blocks: a more advanced version of the Feature Generator blocks that provides the same benefits while keeping the network’s parameter and FLOP counts to a minimum. In this particular embodiment of HooVorNet, the full model has only 9 million parameters and requires 4 billion floating point operations.
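If you want to reproduce this kind of budget check on your own model, parameter counts can be read directly from PyTorch, and FLOPs can be roughly approximated as two operations per convolutional multiply-accumulate. The helper below is a generic sketch (the input shape is an arbitrary example, and it only counts `Conv2d` layers), not the accounting used for the figures quoted above.

```python
import torch
import torch.nn as nn


def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)


def approx_conv_flops(model: nn.Module, input_shape=(1, 64, 256, 256)) -> int:
    """Rough FLOP estimate covering Conv2d layers only, via forward hooks."""
    flops = 0

    def hook(module, inputs, output):
        nonlocal flops
        k_h, k_w = module.kernel_size
        macs_per_output = k_h * k_w * (module.in_channels // module.groups)
        flops += 2 * output.numel() * macs_per_output  # one multiply + one add per MAC

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.Conv2d)]
    with torch.no_grad():
        model(torch.zeros(*input_shape))
    for h in handles:
        h.remove()
    return flops
```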
