Abstract: In previous studies, incomplete extraction of logging-curve features and overly simple model architectures limited the accuracy of porosity prediction. To improve prediction accuracy, an autoencoder, a long short-term memory network, and an attention mechanism are combined to construct the AE-LSTM-AT (autoencoder-long short-term memory network-attention mechanism) model. The AE (autoencoder) maps the feature distributions of the source-domain and target-domain data into the same space, reducing the interference that magnitude differences between data distributions cause in the model. The modified LSTM (long short-term memory network) reduces the number of parameters while enhancing the influence of features from distant time steps and reducing information pollution. The attention mechanism dynamically computes an attention weight for each time step, allowing the model to focus more accurately on key features and improving its performance on sequence data. A control group consisting of MLP (multilayer perceptron) and LSTM models was set up, and four sets of comparison experiments were conducted. The results show that the proposed model structure performs better on long-term prediction and cross-domain prediction problems.
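The following is a minimal sketch of the architecture described above, assuming a PyTorch implementation; the layer sizes, number of input curves, and the additive attention form are illustrative assumptions, not the authors' exact configuration.

import torch
import torch.nn as nn


class AELSTMAT(nn.Module):
    def __init__(self, n_curves=8, latent_dim=16, hidden_dim=32):
        super().__init__()
        # Autoencoder: the encoder projects source- and target-domain
        # logging curves into a shared latent space; the decoder
        # reconstructs the inputs so the encoder can be trained with a
        # reconstruction loss. Sizes here are assumptions.
        self.encoder = nn.Sequential(nn.Linear(n_curves, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, n_curves)
        # LSTM processes the latent sequence along the depth/time axis.
        self.lstm = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        # Attention scores each time step's hidden state so key steps
        # receive larger weights.
        self.attn = nn.Linear(hidden_dim, 1)
        self.head = nn.Linear(hidden_dim, 1)  # porosity estimate

    def forward(self, x):
        # x: (batch, time_steps, n_curves)
        z = self.encoder(x)                     # shared latent features
        recon = self.decoder(z)                 # used for the AE loss term
        h, _ = self.lstm(z)                     # (batch, T, hidden_dim)
        w = torch.softmax(self.attn(h), dim=1)  # per-step attention weights
        context = (w * h).sum(dim=1)            # weighted sequence summary
        return self.head(context).squeeze(-1), recon


# Hypothetical usage: predict porosity from 20 depth steps of 8 curves,
# training with a prediction loss plus a reconstruction loss.
model = AELSTMAT()
logs = torch.randn(4, 20, 8)
porosity, recon = model(logs)
loss = nn.functional.mse_loss(porosity, torch.rand(4)) \
     + nn.functional.mse_loss(recon, logs)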