I have an idea: because there is no single right way to convert a 2D matrix into a long vector, I would assume the conversion itself is a complex process worth optimizing.
That is, I think it's possible to take advantage of the model-optimizing capability of machine learning libraries to generate an index vector that serves as the conversion tool.
The index-select vector need not be exactly the length of the data. It could be shorter, the same length, or longer. I guess it could sample problem areas much more heavily than others, generating duplicates of some data within the same long vector.
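A minimal sketch of that idea in NumPy: a fixed (not yet learned) index vector flattens a 2D matrix, and because it is longer than the data it duplicates entries from a hypothetical "problem area". The matrix and the chosen indices here are made-up illustrations.

```python
import numpy as np

# A 3x3 matrix to be flattened through an explicit index vector.
m = np.arange(9).reshape(3, 3)
flat = m.ravel()  # positions 0..8 in row-major order

# Index vector LONGER than the data: positions 4 and 5 (a hypothetical
# "problem area") are sampled twice, so they appear twice in the output.
idx = np.array([0, 1, 2, 3, 4, 4, 5, 5, 6, 7, 8])

long_vec = flat[idx]
print(long_vec)        # [0 1 2 3 4 4 5 5 6 7 8]
print(long_vec.shape)  # (11,) -- longer than the 9 source elements
```

The same integer-array indexing works for any index vector, shorter or longer than the data, which is what makes the index vector itself a tunable object.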
So whenever there is a design question, such as which data should end up clustered together in the vector, assume a network model can learn the answer.
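To show that an index vector can actually be *learned* by gradient descent, here is a toy sketch, with several loud assumptions: each source position is represented as a one-hot feature row, the training target (selecting positions in reverse order) is invented purely for the demo, and the softmax-based "soft index select" is just one way to make index selection differentiable, not the only one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 9  # number of source positions (e.g. a flattened 3x3 matrix)

# One-hot features per source position: a convex (softmax) mixture can
# only match a one-hot target by concentrating weight on one position,
# which forces the model to learn a hard index.
feats = np.eye(n)
# Assumed target for the demo: select the positions in reverse order.
target = feats[::-1].copy()

# One row of logits per output slot; softmax over the n source
# positions is a differentiable stand-in for an integer index.
logits = rng.normal(size=(n, n)) * 0.01

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr = 5.0
for _ in range(500):
    S = softmax(logits)            # (n, n) soft selection weights
    out = S @ feats                # soft index-select of the features
    grad_out = 2.0 * (out - target) / out.size   # d(MSE)/d(out)
    grad_S = grad_out @ feats.T
    # Softmax backward, row-wise: s * (g - <s, g>)
    dot = (S * grad_S).sum(axis=1, keepdims=True)
    logits -= lr * (S * (grad_S - dot))

# Harden the learned soft weights back into an integer index vector.
learned_idx = softmax(logits).argmax(axis=1)
print(learned_idx)  # expected to recover the reverse ordering 8..0
```

In a real ML library the same loop would be a few lines of autograd code; the point is only that an index vector can sit inside the optimizer like any other parameter.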