Multistep prediction of temperature and humidity in poultry houses based on the GFF-transformer model

Hengyi JI, Guanghui TENG

RESEARCH ARTICLE

Front. Agr. Sci. Eng. DOI: 10.15302/J-FASE-2025603

Highlights

● A module was proposed for extracting multiscale temporal features.

● A multistep model was constructed for predicting temperature and humidity.

● R2 of temperature prediction is higher than 0.88.

● R2 of humidity prediction is higher than 0.86.

Abstract

Accurate predictions of future temperature and humidity in poultry houses are essential for environmental control strategies. Given the complex dynamics of environmental changes, a gated feature fusion (GFF) module was developed to capture multiscale temporal features. This module was integrated with a transformer model to develop a GFF-transformer model. The GFF-transformer model leverages environmental data from the past 24 h (temperature, humidity, CO2 and static air pressure) to predict the temperature and humidity of the next 6, 12, 18 or 24 h. Compared to the long short-term memory (LSTM), gated recurrent unit (GRU) and transformer models, the GFF-transformer model exhibits improved performance. For prediction horizons of 6, 12, 18 and 24 h, the model achieves R2 values between 0.88 and 0.92 for temperature in the range of 20.1 to 31.5 °C, with mean absolute error (MAE) ranging from 0.48 to 0.62 °C and root mean square error (RMSE) ranging from 0.68 to 0.85 °C. For humidity in the range of 18% to 97%, the R2 of the model ranges from 0.86 to 0.94, with MAE between 2.9% and 4.7% and RMSE between 4.3% and 6.4%. Overall, the proposed GFF-transformer model provides a highly accurate, low-error solution for multistep temperature and humidity predictions in poultry houses, offering an effective tool for optimizing environmental control strategies.

Graphical abstract

Keywords

Chicken / humidity / prediction / temperature / transformer


1 Introduction

China’s poultry industry is moving toward standardized and large-scale farming[1]. In large-scale poultry operations, high-yield breeds are commonly used, which are particularly sensitive to environmental stress[2]. Among the factors that influence poultry house environments, temperature and humidity are the most critical. Temperature directly affects bird health, as both excessive heat and cold can slow growth, reduce feed efficiency, and lower egg production[3,4]. Similarly, improper humidity levels can deteriorate air quality and thermal comfort, increasing the risk of pathogen transmission[5,6]. As a result, maintaining an optimal temperature and humidity environment is essential for safeguarding poultry health and enhancing production efficiency.
Advancements in sensor technology now enable real-time monitoring of environmental factors in poultry houses[7]. Currently, most environmental control systems in poultry houses rely on comparing real-time temperature data from sensors with preset thresholds to determine when to activate control equipment[8]. However, there is often a delayed response from the equipment to changes in temperature and humidity[9]. This delay can negatively impact the microclimate in the poultry house, potentially affecting chicken growth. Consequently, feedback-based environmental control methods, which adjust based on sensor data, may not respond quickly enough to external environmental changes, leading to decreased control efficiency. To enhance the accuracy and responsiveness of environmental control systems, it is crucial to predict temperature and humidity in advance. This allows for optimization of control strategies and helps maintain a stable and suitable microclimate in poultry houses[10].
Temperature and humidity in poultry houses are influenced by multiple factors, exhibiting nonlinear and time-delay characteristics with rich time-series patterns over both short and long periods. In poultry farming, researchers have explored time-series predictions of environmental factors. This research generally falls into two categories: mathematical models[11,12] and machine learning models[13]. Wang et al.[11,12] developed a discrete model that integrates a time period group (TPG), a group buffer rolling mechanism, and TPG factors to predict future temperatures using limited data. However, mathematical models are typically used for sparse data, small samples, and insufficient information[11]. With advances in sensor technology, collecting large amounts of time-series data on environmental factors in poultry houses has become easier, making mathematical models less suitable for large-scale time-series prediction. In contrast, machine learning models can automatically extract and learn complex patterns from time-series data, enabling accurate predictions of future conditions[14]. Xu et al.[13] used an ensemble empirical mode decomposition-gated recurrent unit model to predict future ammonia concentrations based on past data of ammonia levels, temperature and humidity. Current research on predicting temperature and humidity in poultry houses is relatively limited. Therefore, developing a multistep prediction model capable of handling multifeature and large-scale input data for poultry house temperature and humidity is a critical area of study.
In recent years, deep learning models, including recurrent neural network (RNN), long short-term memory (LSTM), gated recurrent unit (GRU) and transformer models, have seen rapid development and widespread use in multistep time-series prediction[15]. For example, Dai et al.[16] applied an RNN model to predict future indoor PM2.5 concentrations based on historical data. Sekertekin et al.[17] combined an adaptive neuro-fuzzy inference system with an LSTM model to predict hourly and daily temperatures using annual data. Similarly, He et al.[18] used a GRU model to predict the minimum future temperature in a greenhouse based on local weather data. Despite their effectiveness, these methods tend to be computationally intensive and often fail to capture sufficient global information. In contrast, the self-attention mechanism of the transformer model allows it to model any part of the input, regardless of temporal period, giving it greater potential for handling long-term information and capturing long-term dependencies[19]. Wu et al.[20] introduced a transformer-based time-series prediction model and compared it with an LSTM model on an influenza outbreak data set. Their results showed that while the transformer outperformed the LSTM on univariate data sets, the LSTM performed better on multivariate data sets. These results indicate that the transformer model may lack the ability to fully capture interactions between multiple features.
To overcome the limitations of the transformer architecture, recent research has explored combining transformer with convolutional neural network (CNN) to handle tasks such as time-series prediction[21]. Chen et al.[22] developed a CNN-transformer hybrid model for predicting ozone concentration, showing that it performs better than LSTM and transformer models. This hybrid model aims to integrate the strengths of CNNs in learning local and correlated features with the ability of the transformer to capture long-term dependencies, thereby enhancing overall prediction performance. For this study, we proposed a CNN module specifically designed for multivariate time-series prediction. This module incorporates multiple convolutional layers with varying receptive field sizes, allowing it to capture both short-term and long-term patterns. This design helps the model to better process the multiscale features within the data, leading to improved performance in handling complex time-series data.
The primary objective of this study was to develop and evaluate a model for multistep prediction of temperature and humidity in poultry houses. Given the complexity of temperature and humidity variations, we propose a gated feature fusion (GFF) module to extract multiscale temporal features and integrate it with a transformer architecture, resulting in the GFF-transformer model. This model used the past 24 h of environmental data (temperature, humidity, CO2 and static air pressure) to predict temperature and humidity for the next 6, 12, 18 or 24 h. When compared to LSTM, GRU and transformer models, the GFF-transformer demonstrated improved accuracy in multistep predictions of temperature and humidity. This study offers a valuable technological foundation for the precise and intelligent management of poultry house environments in the future.

2 Methodology

2.1 Data collection

The experiment took place between 20 March and 16 December 2023, in an egg-laying poultry house in Haiyang City, Shandong Province, China. The house was 86 m long by 12.5 m wide and used a four-tier stacked cage system with four rows of cages. About 26,000 laying hens, aged 14–22 months, were housed in the facility during the trial. A conveyor belt beneath the cages collected manure, which was removed daily. The hens had unrestricted access to feed and water. The poultry house was equipped with a cooling and ventilation system, consisting of 16 fans (127 cm in diameter) on the gable wall, two cross-flow fans (91 cm in diameter) on each sidewall, one cooling pad on each sidewall, and 31 side windows. Environmental conditions were monitored using three temperature sensors, one humidity sensor, one CO2 sensor, and one static air pressure sensor. Fig.1 shows the locations of various sensors placed in the chicken house. The temperature sensors were positioned at the front, middle, and rear of the house, while the other sensors were placed centrally. Data from the sensors were recorded every 5 min and uploaded to a cloud platform using internet of things technology for storage and analysis.
Fig.1 Location of different sensors.


2.2 Data preprocessing

During data collection and transmission, sensor data losses occurred, and linear interpolation was used to fill the missing values. Three temperature sensors were installed in the house, and their averaged data was taken as the overall temperature of the facility. Following the preprocessing method used by Wang et al.[12], all environmental data were aggregated into hourly averages. Fig.2 illustrates the trends of the four environmental variables. Temperature and humidity had similar patterns, peaking in the summer and remaining relatively low during other seasons. In contrast, CO2 concentrations had an opposite trend, decreasing from spring to summer and then rising from summer to winter. This inverse relationship is related to the ventilation strategy in the poultry house, where maximum ventilation during the hot and humid summer months reduces CO2 levels. These findings show that the variation of a single environmental factor can be influenced by other factors. A multivariate data set allows the model to account for the interactions among all environmental variables, enabling it to capture more complex dynamic relationships and thereby improve prediction performance. Air pressure changes were less distinct, with a slight downward trend in summer, likely due to increased ventilation. In October, air pressure fluctuations became more pronounced, followed by a drop to levels below those observed in summer, which may be attributed to a decline in sensor accuracy. However, a rising trend in air pressure was observed in November, reflecting a gradual decrease in ventilation as winter approached.
Fig.2 The trend of various environmental data (temperature, humidity, CO2, and air pressure).

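As a concrete illustration, the preprocessing steps described above (averaging the three temperature sensors, linear interpolation of missing values, and hourly aggregation) could be implemented as in the minimal pandas sketch below. The file name and column names (sensor_log.csv, timestamp, temp_front, temp_middle, temp_rear) are hypothetical, as the paper does not specify its data format.

```python
import pandas as pd

# Minimal preprocessing sketch (hypothetical file and column names):
# 5-min sensor records -> gap-filled series -> hourly averages
df = pd.read_csv("sensor_log.csv", parse_dates=["timestamp"], index_col="timestamp")

# Average the three temperature sensors into one house-level temperature
df["temperature"] = df[["temp_front", "temp_middle", "temp_rear"]].mean(axis=1)

# Linear interpolation fills values lost during collection and transmission
df = df.interpolate(method="linear")

# Aggregate all environmental variables into hourly means
hourly = df.resample("1H").mean()
```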

In time-series prediction, a sliding window approach is typically used to structure sequential data into a usable data set. Fig.3 illustrates this method, using temperature prediction as an example. The input and output windows move synchronously across the entire time series with a fixed step size, continuing until the output window reaches the end of the sequence. For this study, we used four variables (temperature, humidity, CO2 and air pressure) collected over the previous 24 h as input features. The prediction targets were the temperature and humidity for the next 6, 12, 18 and 24 h. A sliding window with a step size of 1 h was applied. The data set was structured into 30-day cycles, with each cycle split sequentially into a training set and a test set at a 7:3 ratio. For the final cycle, which contained fewer than 30 days, the same ratio was maintained.
Fig.3 Illustration of the sliding window approach for predicting temperature.

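To make the windowing concrete, the following sketch shows one way synchronized input and output windows could be cut from the hourly series. The function name and the random example data are illustrative, not taken from the paper.

```python
import numpy as np

def make_windows(data, input_len=24, horizon=6, step=1, target_col=0):
    """Slide synchronized input/output windows over an (N, F) series.

    data: array of shape (N, F) with columns
          [temperature, humidity, CO2, air pressure] (assumed order).
    Returns X of shape (num_samples, input_len, F) and
            y of shape (num_samples, horizon) for the target column.
    """
    X, y = [], []
    last_start = len(data) - input_len - horizon
    for start in range(0, last_start + 1, step):
        X.append(data[start:start + input_len])
        y.append(data[start + input_len:start + input_len + horizon, target_col])
    return np.stack(X), np.stack(y)

# Example: 1000 hourly records, 4 features; predict temperature 6 h ahead
series = np.random.rand(1000, 4)
X, y = make_windows(series, input_len=24, horizon=6)
print(X.shape, y.shape)  # (971, 24, 4) (971, 6)
```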

2.3 GFF-transformer model

2.3.1 Transformer

Unlike the standard RNN, transformer relies entirely on the attention mechanism to capture global context within input and output sequences, offering a clear advantage in parallel processing. The transformer architecture is based on an encoder-decoder framework. In time-series prediction, the primary function of the encoder is to recognize and extract temporal patterns and features from the input sequence. Therefore, only the encoder is required to process the input, convert it into a hidden representation (embedding), and generate predictions. The encoder is composed of several identical layers, each containing two key components: a multi-head self-attention (MHSA) mechanism and a feed forward network (FFN).
MHSA is a key component of the transformer model and serves as an advanced form of the standard attention mechanism. Fig.4 shows the architecture of the MHSA. MHSA splits the input into multiple segments, enabling the model to compute attention separately for each segment, thus identifying diverse relationships within the data. Given a sequence of input vectors, the attention mechanism produces three matrices, query (Q), key (K) and value (V), with the calculation process as follows.
Fig.4 Multi-head self-attention architecture.


$$Q = XW_Q,\quad K = XW_K,\quad V = XW_V$$
where, X is the input matrix and WQ, WK and WV are the weight matrices. The attention weights are derived from the scaled dot products of Q and K. These weights are then used to compute a weighted sum of the value vectors, which generates the output. The attention weights are calculated as:
$$\mathrm{Attention}(Q, K, V) = \mathrm{Softmax}\left(\frac{QK^{\mathrm{T}}}{\sqrt{d_k}}\right)V$$
where, dk is the number of columns in Q and K. Standard attention mechanisms are limited in their ability to capture information from different subspaces. To overcome this limitation, as shown in Fig.4, the MHSA mechanism splits the process into multiple heads. Each head calculates attention weights independently. The outputs from all heads are then concatenated and transformed through a linear layer to produce the final result, calculated as:
$$\mathrm{MultiHead}(Q, K, V) = \mathrm{Concat}(\mathrm{head}_1, \ldots, \mathrm{head}_h)W_M$$
$$\mathrm{head}_i = \mathrm{Attention}(Q_i, K_i, V_i)$$
where, WM is the weight matrix and h is the number of heads.
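The two formulas above can be made concrete with a short sketch. This is an illustrative implementation with explicit weight tensors, not the authors' code; in practice a library layer such as torch.nn.MultiheadAttention would typically be used.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(Q, K, V):
    """Compute Softmax(Q K^T / sqrt(d_k)) V, as in the attention formula above."""
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / d_k ** 0.5   # (batch, heads, T, T)
    weights = F.softmax(scores, dim=-1)             # attention weights
    return weights @ V                              # weighted sum of value vectors

def multi_head_self_attention(X, W_Q, W_K, W_V, W_M, h):
    """Split the model dimension into h heads, attend per head, then re-merge."""
    B, T, d = X.shape
    Q, K, V = X @ W_Q, X @ W_K, X @ W_V                      # (B, T, d)
    split = lambda M: M.view(B, T, h, d // h).transpose(1, 2)  # (B, h, T, d/h)
    out = scaled_dot_product_attention(split(Q), split(K), split(V))
    out = out.transpose(1, 2).reshape(B, T, d)               # concatenate heads
    return out @ W_M                                         # final linear map

# Shape check with the dimensions used later in this paper (B=64, T=24, d=128, h=4)
B, T, d, h = 64, 24, 128, 4
X = torch.randn(B, T, d)
W = lambda: torch.randn(d, d) / d ** 0.5
print(multi_head_self_attention(X, W(), W(), W(), W(), h).shape)  # (64, 24, 128)
```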
Each FFN in the encoder layer comprises two fully connected layers separated by a rectified linear unit (ReLU) activation function. The main purpose of FFN is to apply nonlinear transformations to the features, which increases the expressive power of the model.
Because the transformer model lacks recurrent or convolutional operations, it inherently cannot capture positional information within sequences. To overcome this, the transformer incorporates positional encoding (PE). This is an additive component that imparts relative or absolute positional information to the model for each element in the sequence. PEs are generally precomputed and added directly to the input embedding, being defined as:
$$PE_{(pos,\,2i)} = \sin\left(\frac{pos}{10000^{2i/d}}\right)$$
$$PE_{(pos,\,2i+1)} = \cos\left(\frac{pos}{10000^{2i/d}}\right)$$
where, pos is the position of a timestamp within the time series. Each timestamp is assigned a unique position: 1 for the first timestamp, 2 for the second and so forth. d is the dimensionality of the PE, and i is an index that ranges from 0 to d/2 – 1, so that 2i indexes the even dimensions and 2i + 1 the odd dimensions.
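A common way to precompute this PE matrix is sketched below. Note that, unlike the 1-based positions described above, the sketch follows the usual 0-based indexing convention and assumes an even d_model.

```python
import math
import torch

def sinusoidal_positional_encoding(max_len, d_model):
    """Precompute the additive sinusoidal PE matrix of shape (max_len, d_model)."""
    pos = torch.arange(max_len).unsqueeze(1).float()       # positions 0..max_len-1
    div = torch.exp(torch.arange(0, d_model, 2).float()
                    * (-math.log(10000.0) / d_model))      # 1 / 10000^(2i/d)
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)   # even dimensions
    pe[:, 1::2] = torch.cos(pos * div)   # odd dimensions
    return pe
```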

2.3.2 Gated feature fusion module

One-dimensional CNNs (1D CNNs) have gained wide application in time-series prediction due to their strong feature extraction capabilities and parallel processing advantages. However, when multiple 1D CNN layers are stacked in sequence, the model may become prone to overfitting as network depth increases. Inspired by InceptionNet[23] and LSTM[24], we propose a new architecture called the GFF module. This module uses a parallel multi-branch structure to promote feature sparsity and effectively extract multiscale features from time series. Fig.5 shows the architecture of the GFF module. The GFF module extracts features using three 1D CNN layers, each with a different number of groups (1, 2 and 4). A larger number of groups expands the receptive field, enabling the capture of broader spatiotemporal information. To selectively fuse these features, we introduce a gating mechanism similar to the gated units in LSTM networks. The gating mechanism operates in two steps: first, a sigmoid activation function generates weights, ranging from 0 to 1, for the CNN layer outputs. These weights are then multiplied by the CNN output features. Each branch has its own gating unit, allowing the model to learn which information to retain or discard. Finally, the gated features are combined to achieve effective multiscale feature fusion (a minimal sketch follows Fig.5). The GFF module enhances the ability of the model to process multiscale data, improves generalization, and reduces the risk of overfitting.
Fig.5 Gated feature fusion module architecture.

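A minimal sketch of the GFF idea is given below. The three grouped-convolution branches and the sigmoid gates follow the description above; the exact gate parameterization, and the fusion by channel concatenation with temporal pooling, are assumptions made to match the (B, 96) output shape reported in Section 2.3.3.

```python
import torch
import torch.nn as nn

class GFFModule(nn.Module):
    """Sketch of the gated feature fusion idea (details assumed where unstated)."""
    def __init__(self, in_channels=4, out_channels=32, kernel_size=3):
        super().__init__()
        # Three parallel 1D grouped convolutions (groups = 1, 2, 4)
        self.branches = nn.ModuleList([
            nn.Conv1d(in_channels, out_channels, kernel_size,
                      padding=kernel_size // 2, groups=g)
            for g in (1, 2, 4)
        ])
        # One gate per branch; parameterized here as a parallel grouped conv (assumed)
        self.gates = nn.ModuleList([
            nn.Conv1d(in_channels, out_channels, kernel_size,
                      padding=kernel_size // 2, groups=g)
            for g in (1, 2, 4)
        ])

    def forward(self, x):                            # x: (B, F, T) = (B, 4, 24)
        gated = []
        for branch, gate in zip(self.branches, self.gates):
            feats = branch(x)                        # (B, 32, 24)
            weights = torch.sigmoid(gate(x))         # per-branch gate in (0, 1)
            gated.append(weights * feats)            # retain/discard information
        fused = torch.cat(gated, dim=1)              # (B, 96, 24), 3 x 32 channels
        # Pool over time to reach (B, 96); the pooling choice is an assumption
        return fused.mean(dim=-1)
```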

2.3.3 GFF-transformer architecture

For this study, the temperature and humidity prediction model for poultry houses was a GFF-transformer model, which integrates the GFF module with the transformer model. The architecture of this model is shown in Fig.6. Initially, the GFF module and transformer encoder module independently extract features from the preprocessed time-series data. These extracted features are then merged and reorganized into new feature representations, which are passed through a fully connected layer to generate predictions for future temperature and humidity. In the transformer encoder, two layers are used, each with 4 attention heads in the MHSA and a dimensionality of 128 for both the input and output of the FFN. In the GFF module, all grouped convolutions have a kernel size of 3, with 32 output channels. By extracting features in parallel, the model can capture information from various perspectives, resulting in richer final feature representations.
Fig.6 GFF-transformer architecture.


The GFF-transformer model consists of four main components: the input layer, the transformer module, the GFF module and the output layer.
Input layer The model receives a multidimensional time series consisting of temperature, humidity, CO2 and air pressure as input. This time series is represented as a three-dimensional matrix with the shape (B, T, F), where B is the number of samples input into the model, T = {T1, T2, ..., T24} (the 24 time steps of the input sequence) and F = {F1, F2, F3, F4} (F1–F4 represent temperature, humidity, CO2 and air pressure, respectively). The shape of the time series was set to (B, 24, 4) for this study.
Transformer encoder module The time series with the shape (B, 24, 4) is first passed through an embedding layer, converting it to a shape of (B, 24, 128). After applying PE, it is reshaped to (24, B, 128) to align with the transformer architecture. The features extracted by the two transformer encoder layers are then used, with the output being the feature of the final time step.
GFF module The time series with the shape (B, 24, 4) is reshaped to (B, 4, 24) to facilitate 1D group convolution operations. After feature extraction through the GFF module, the features are transformed to the shape (B, 96), and then passed through a fully connected layer, resulting in an output shape of (B, 128).
Output layer The two feature matrices with the shape (B, 128) are concatenated along the second dimension and passed through a fully connected layer to generate predictions for temperature or humidity. The final output has the shape (B, P), where P = {P1, P2, ..., Pi}, with Pi being the predicted temperature or humidity at the i-th future time step, up to 6, 12, 18 or 24 time steps ahead.
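Putting the pieces together, the following sketch shows how the four components could be wired up, reusing the GFFModule and sinusoidal_positional_encoding sketches above. The layer sizes follow the text; the remaining details (e.g., a linear embedding layer) are assumptions.

```python
import torch
import torch.nn as nn

class GFFTransformer(nn.Module):
    """Sketch of the full model under the stated layer sizes (details assumed)."""
    def __init__(self, n_features=4, seq_len=24, d_model=128, horizon=6):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)          # assumed embedding
        self.register_buffer("pe",
                             sinusoidal_positional_encoding(seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           dim_feedforward=128)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.gff = GFFModule(in_channels=n_features, out_channels=32)
        self.gff_fc = nn.Linear(96, d_model)                 # (B, 96) -> (B, 128)
        self.head = nn.Linear(2 * d_model, horizon)          # fused -> predictions

    def forward(self, x):                                    # x: (B, 24, 4)
        h = self.embed(x) + self.pe                          # add positional info
        h = self.encoder(h.transpose(0, 1))                  # (24, B, 128), seq-first
        trans_feat = h[-1]                                   # final time step: (B, 128)
        gff_feat = self.gff_fc(self.gff(x.transpose(1, 2)))  # (B, 128)
        return self.head(torch.cat([trans_feat, gff_feat], dim=1))  # (B, horizon)
```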

2.4 Model parameters and setting

This study used three popular time-series prediction models (LSTM, GRU and transformer) as benchmark models for comparison. Both the LSTM and GRU models consisted of 2 layers with 128 neurons each. The transformer model, like the GFF-transformer, used only the encoder portion with identical parameters.
The experiments were conducted on a Windows 10 (64-bit) system with 16 GB of RAM and an NVIDIA GTX 1660 6 GB GPU. The models were developed and evaluated using Python 3.8 and the PyTorch 1.8.1 deep learning framework. All models were trained with the following hyperparameters: a learning rate of 0.02, the AdamW optimizer, a batch size of 64, and 500 epochs. The loss function used for training was the mean squared error (MSE), calculated as:
$$\mathrm{Loss} = \mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}\left(\hat{y}_i - y_i\right)^2$$
where, N is the total number of samples in the data set, ŷi is the predicted value of the i-th sample and yi is the true value of the i-th sample.
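A training loop matching these hyperparameters might look as follows; the model comes from the earlier sketch, and a train_loader yielding batches of 64 input/target windows is assumed.

```python
import torch
import torch.nn as nn

# Hypothetical training setup mirroring the stated hyperparameters
model = GFFTransformer(horizon=6)
optimizer = torch.optim.AdamW(model.parameters(), lr=0.02)
loss_fn = nn.MSELoss()

for epoch in range(500):
    for X_batch, y_batch in train_loader:        # batches of 64 windows (assumed)
        optimizer.zero_grad()
        loss = loss_fn(model(X_batch), y_batch)  # MSE between predictions and truth
        loss.backward()
        optimizer.step()
```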

2.5 Models evaluation

The performance of the prediction models was evaluated using three metrics: mean absolute error (MAE), root mean square error (RMSE) and the coefficient of determination (R2). Lower MAE and RMSE values, and a higher R2, indicate improved predictive accuracy. The formulas for these metrics are:
$$\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|\hat{y}_i - y_i\right|$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\hat{y}_i - y_i\right)^2}$$
$$R^2 = 1 - \frac{\sum_{i=1}^{N}\left(\hat{y}_i - y_i\right)^2}{\sum_{i=1}^{N}\left(\bar{y} - y_i\right)^2}$$
where, $\bar{y}$ is the mean value of all samples.
To evaluate the prediction performance of different models across various prediction horizons, this study introduced the temperature prediction error (εT) and the humidity prediction error (εH), defined as:
$$\varepsilon_T(i) = P_T(i) - A_T(i)$$
$$\varepsilon_H(i) = P_H(i) - A_H(i)$$
where, PT and AT are the predicted and actual temperature at time i, and PH and AH are the predicted and actual humidity at time i.
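For reference, the three metrics and the per-sample error can be computed in a few lines; this sketch assumes NumPy arrays of predictions and ground-truth values.

```python
import numpy as np

def evaluate(y_pred, y_true):
    """Compute MAE, RMSE and R2 for arrays of predictions and ground truth."""
    err = y_pred - y_true                      # per-sample prediction error (epsilon)
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return mae, rmse, r2
```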

3 Results and discussion

3.1 Prediction results of temperature and humidity

Tab.1 shows the temperature prediction results for prediction horizons of 6, 12, 18 and 24 h using different models. The GFF-transformer model consistently outperformed the others across all prediction horizons, followed by the LSTM, GRU and transformer models. For predictions within the next 6 h, the GFF-transformer model achieved R2 values 0.01, 0.02 and 0.07 higher than the LSTM, GRU and transformer models, respectively. Its MAE was lower by 0.05, 0.15 and 0.24 °C, and its RMSE was lower by 0.05, 0.17 and 0.27 °C compared to these models. For predicting temperature over the next 24 h, the GFF-transformer model had an R2 improvement of 0.04, 0.04 and 0.08 over the LSTM, GRU and transformer models. Its MAE decreased by 0.11, 0.11 and 0.17 °C, and its RMSE decreased by 0.1, 0.12 and 0.21 °C, respectively. This demonstrates that the GFF-transformer model has a clear advantage in both short-term and long-term predictions. The transformer model exhibited the highest prediction errors at all prediction horizons, likely because its reliance on self-attention mechanisms does not capture local and relevant features as effectively as the GFF-transformer model. Also, all models showed decreased accuracy with longer prediction horizons, highlighting the increased difficulty of long-term predictions.
Tab.1 Temperature prediction results using different prediction steps and models

Prediction horizon (h)  Metric      LSTM  GRU   Transformer  GFF-transformer
6                       R2          0.91  0.90  0.85         0.92
                        MAE (°C)    0.53  0.63  0.72         0.48
                        RMSE (°C)   0.73  0.85  0.95         0.68
12                      R2          0.88  0.88  0.84         0.89
                        MAE (°C)    0.59  0.68  0.75         0.57
                        RMSE (°C)   0.82  0.92  0.98         0.79
18                      R2          0.87  0.86  0.81         0.88
                        MAE (°C)    0.66  0.66  0.77         0.60
                        RMSE (°C)   0.88  0.90  1.03         0.83
24                      R2          0.84  0.84  0.80         0.88
                        MAE (°C)    0.73  0.73  0.79         0.62
                        RMSE (°C)   0.95  0.97  1.06         0.85

Note: LSTM is the long short-term memory. GRU is the gated recurrent unit. GFF-transformer is the transformer model integrated with the gated feature fusion module. R2 is the coefficient of determination. MAE is the mean absolute error. RMSE is the root mean square error.

Tab.2 presents the humidity prediction results for prediction horizons of 6, 12, 18 and 24 h using various models. Consistent with the temperature predictions, the accuracy of humidity predictions ranked from highest to lowest: GFF-transformer, LSTM, GRU and transformer models. Notably, the GFF-transformer model outperformed the transformer model with an R2 improvement of 0.03, 0.04, 0.08 and 0.08 for the 6, 12, 18 and 24 h predictions, respectively. Its MAE was lower by 0.81%, 0.99%, 1.37% and 1.43%, and its RMSE lower by 0.76%, 0.97%, 1.7% and 1.74%, respectively. These results highlight the superior performance of the GFF-transformer model in both temperature and humidity predictions. The GRU model showed higher prediction errors than the LSTM model for both temperature and humidity, likely due to its simpler structure and fewer parameters. Also, as the prediction horizon increased, all models showed increased prediction errors for humidity, mirroring the trend observed in temperature predictions.
Tab.2 Humidity prediction results using different prediction steps and models

Prediction horizon (h)  Metric     LSTM  GRU   Transformer  GFF-transformer
6                       R2         0.92  0.91  0.91         0.94
                        MAE (%)    3.26  3.79  3.72         2.91
                        RMSE (%)   4.80  5.19  5.11         4.25
12                      R2         0.88  0.87  0.86         0.90
                        MAE (%)    4.35  4.50  4.77         3.78
                        RMSE (%)   6.01  6.31  6.39         5.42
18                      R2         0.85  0.83  0.80         0.88
                        MAE (%)    4.92  5.26  5.73         4.37
                        RMSE (%)   6.78  7.18  7.77         6.07
24                      R2         0.82  0.80  0.78         0.86
                        MAE (%)    5.21  5.79  6.03         4.70
                        RMSE (%)   7.29  7.63  8.14         6.40

Note: LSTM is the long short-term memory. GRU is the gated recurrent unit. GFF-transformer is the transformer model integrated with the gated feature fusion module. R2 is the coefficient of determination. MAE is the mean absolute error. RMSE is the root mean square error.

Tab.3 presents the training and testing runtimes for the LSTM, GRU, transformer and GFF-transformer models. These results show that the transformer model took longer to run than the LSTM and GRU models. The GFF-transformer, which introduces a new parallel processing module to boost model performance, had higher computational complexity, leading to the longest runtime. Specifically, the GFF-transformer took 504 s (about 8.4 min) for training and 0.25 s for testing. While the runtime is longer, it is acceptable for high-precision predictions. Also, because temperature and humidity forecasts need to be updated only hourly, the GFF-transformer model can meet real-time operation demands.
Tab.3 Runtimes of proposed and compared models
Runtime       LSTM  GRU   Transformer  GFF-transformer
Training (s)  265   254   410          504
Testing (s)   0.14  0.08  0.21         0.25

3.2 Prediction error of models

Fig.7 and Fig.8 illustrate the error distributions for temperature and humidity predictions made by the LSTM, GRU, transformer and GFF-transformer models. In Fig.7 and Fig.8, the y-axis is centered at 0, where smaller deviations from this baseline indicate higher prediction accuracy. The white symbols indicate the median of the error, the boxes show the interquartile range, and the whiskers cover 99.3% of the data. Fig.7 and Fig.8 show that the GFF-transformer model had a mean prediction error close to 0. For temperature predictions at prediction horizons of 6, 12, 18 and 24 h, the error ranges were [–1.28, 1.44], [–1.49, 1.66], [–1.72, 1.72] and [–1.71, 1.84] °C, respectively. For humidity predictions at the same horizons, the error ranges were broader at the peaks, with values of [–8, 7.35], [–10.59, 10.42], [–12.65, 10.66] and [–14.12, 13.59] %. However, within the interquartile range, the errors were much more confined, specifically [–2.23, 1.62], [–2.71, 2.52], [–3.91, 1.92] and [–3.73, 3.2] %. These results indicate that the GFF-transformer model achieves a more concentrated and stable error distribution for both temperature and humidity predictions.
Fig.7 Error (εT) results of temperature prediction by different models of (a) LSTM, (b) GRU, (c) transformer, and (d) GFF-transformer.


Fig.8 Error (εH) results of humidity prediction by different models of (a) LSTM, (b) GRU, (c) transformer, and (d) GFF-transformer.


Compared to the transformer model (Fig.7(c) and Fig.8(c)), the GFF-transformer model (Fig.7(d) and Fig.8(d)) gave prediction errors that were closer to the baseline and more densely clustered, with lower peak errors. This indicates that the GFF module enhances the ability of the transformer to capture multiscale temporal features, leading to reduced prediction errors. Similarly, the GFF-transformer model demonstrates lower peak errors and a more concentrated error distribution than the GRU model (Fig.7(b) and Fig.8(b)). When compared to the LSTM model (Fig.7(a) and Fig.8(a)), the GFF-transformer model had slightly higher peak errors at the 6-h temperature and the 24-h humidity prediction horizons. However, its errors were more densely clustered near the baseline, resulting in higher overall accuracy. For the other prediction horizons, the GFF-transformer model consistently exhibited lower peak errors than the LSTM model. These findings demonstrate the effectiveness of the GFF-transformer model in multistep temperature and humidity prediction in poultry houses.

3.3 Visual analysis of prediction results

Given that multistep prediction results cannot be adequately visualized in a single plot, we analyzed the prediction results at the final prediction step of each horizon. Fig.9 compares the true temperatures from the test set with the predictions made by the different models. The GFF-transformer and LSTM models delivered predictions closer to the actual temperature than the GRU and transformer models. At the shortest horizon (Fig.9(a)), the GFF-transformer and LSTM predictions are nearly identical. However, as the prediction horizon increases (Fig.9(b–d)), the prediction error of the LSTM model grows significantly, while the GFF-transformer model remains closer to the actual temperature. The GFF-transformer model fits well during periods of daily temperature rise and fall, but deviations occur at temperature turning points. Specifically, it tends to underpredict peak temperatures and overpredict the lowest temperatures, indicating that the GFF-transformer model smooths the prediction trend, which reduces accuracy at temperature peaks and inflection points.
Fig.9 Multistep temperature prediction results of different models of (a) 6-step, (b) 12-step, (c) 18-step, and (d) 24-step prediction results.


Fig.10 compares the actual humidity in the test set with the predictions of the different models. Humidity exhibits more pronounced fluctuations than temperature, making it harder to predict. Consequently, all models had lower accuracy in predicting humidity than temperature. At some prediction horizons in Fig.10, the GFF-transformer model deviated significantly between data points 600 and 800, with all predictions skewing too high, whereas at others it gave a better fit in that range, though performance was weaker elsewhere. Between data points 800 and 1200, the LSTM model provided a better fit than the GFF-transformer model; outside this range, however, the GFF-transformer model still outperformed it. Despite increasing prediction errors as the prediction horizon extended, the GFF-transformer model consistently performed better than the baseline models across the different time steps.
Fig.10 Multistep humidity prediction results of different models of (a) 6-step, (b) 12-step, (c) 18-step, and (d) 24-step prediction results.


3.4 Shapley additive explanations analysis

The data presented above have established the performance of the GFF-transformer model in forecasting temperature and humidity. However, the black-box nature of deep learning models complicates the direct understanding of the internal processes and prediction mechanisms. To address this, we used Shapley additive explanations (SHAP), a method grounded in Shapley value theory from game theory[25], to enhance model transparency. Fig.11 shows the importance ranking of different features in temperature and humidity prediction. For temperature prediction, SHAP analysis revealed that temperature data are the most significant influencing factor, as historical temperature data often provide an effective prediction of future temperature changes. Other variables, including CO2, humidity and air pressure, were also important, indicating complex interactions between these factors and temperature. Similarly, for humidity predictions, humidity data are the dominant feature, with its SHAP value notably higher than others. CO2 is again a key factor, reinforcing its indirect influence on the temperature and humidity conditions within poultry houses.
Fig.11 Importance of input features for (a) temperature and (b) humidity predictions.

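As an illustration of how such a feature ranking could be produced, the sketch below applies the SHAP GradientExplainer to a trained PyTorch model. The tensor names and sample counts are hypothetical, the paper does not state which SHAP explainer it used, and the list-style output assumed here matches older shap releases (around 0.41).

```python
import numpy as np
import shap

# Hypothetical SHAP analysis; `model` and the window tensors are assumed
# to come from the earlier sketches
background = X_train_tensor[:100]                   # reference samples (assumed)
explainer = shap.GradientExplainer(model, background)
shap_values = explainer.shap_values(X_test_tensor[:200])  # one array per output step

# Rank the four input features by mean |SHAP| over samples, time steps and outputs
abs_vals = np.mean([np.abs(sv) for sv in shap_values], axis=(0, 1, 2))
for name, v in zip(["temperature", "humidity", "CO2", "air pressure"], abs_vals):
    print(f"{name}: {v:.4f}")
```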

3.5 Limitations

The GFF-transformer model presented in this study performed effectively in predicting future temperature and humidity within poultry houses. However, certain limitations remain. The data set used comprised hourly averages from a fixed set of sensors. In practice, temperature and humidity in poultry houses are not uniformly distributed. For example, under high ventilation, asymmetric external wind pressure can disrupt internal airflow circulation, resulting in significant differences in thermal and humidity conditions between the front and rear of the house[26]. Relying solely on averages cannot fully capture the actual environmental complexities. Future research should focus on developing predictive models that incorporate environmental data from various points within the house. In this study, predictions for future temperature and humidity were made using only the past 24 h of data on temperature, humidity, CO2 and air pressure. However, several other factors influence temperature and humidity in poultry houses. For example, increasing airflow is a crucial strategy for alleviating heat stress in chickens, and many researchers are investigating how to resolve issues such as uneven airflow and low wind speeds to maintain a comfortable thermal environment[27,28]. Environmental control strategies in enclosed poultry houses often aim to counteract external challenges, but energy constraints may prevent maintaining a consistently comfortable thermal environment. In colder regions, poultry houses tend to have lower internal temperatures[29], while in hotter regions, internal temperatures are normally higher. To improve prediction accuracy, future research should incorporate more environmental variables and install a wider variety of sensors inside and outside the poultry house. This would enable more precise prediction of temperature and humidity.

4 Conclusions

This study introduced a GFF-transformer model for multistep prediction of temperature and humidity in poultry houses, demonstrating good performance. From this work we draw three main conclusions. Firstly, the GFF module improves the ability of the transformer to capture multiscale features and process time-series data, which reduces prediction errors. Secondly, when compared to LSTM, GRU and transformer models, the GFF-transformer outperforms them in both temperature and humidity predictions. For prediction horizons of 6, 12, 18 and 24 h on the test set, the model achieved a temperature prediction R2 between 0.88 and 0.92, with an MAE of 0.48 to 0.62 °C and an RMSE of 0.68 to 0.85 °C. For humidity prediction, the R2 ranged from 0.86 to 0.94, with an MAE of 2.9% to 4.7% and an RMSE of 4.3% to 6.4%. Thirdly, the GFF-transformer model also provided a more favorable error distribution across different prediction horizons than the other models on the test set. In summary, the proposed GFF-transformer model achieved high accuracy in multistep temperature and humidity predictions in poultry houses. This model offers a theoretical foundation for precise environmental control and early warning systems, contributing to enhanced poultry production efficiency.

References

[1] Yang N. Egg production in China: current status and outlook. Frontiers of Agricultural Science and Engineering, 2021, 8(1): 25–34
[2] Dawkins M S, Donnelly C A, Jones T A. Chicken welfare is influenced more by housing conditions than by stocking density. Nature, 2004, 427(6972): 342–344
[3] Reece F N, Deaton J W, Kubena L F. Effects of high temperature and humidity on heat prostration of broiler chickens. Poultry Science, 1972, 51(6): 2021–2025
[4] Borges S A, Silva A D, Majorka A, Hooge D M, Cummings K R. Physiological responses of broiler chickens to heat stress and dietary electrolyte balance (sodium plus potassium minus chloride, milliequivalents per kilogram). Poultry Science, 2004, 83(9): 1551–1558
[5] Dennis M J. The effects of temperature and humidity on some animal diseases—A review. British Veterinary Journal, 1986, 142(5): 472–485
[6] Kocaman B, Esenbuga N, Yildiz A, Lacin E, Macit M. Effect of environmental conditions in poultry houses on the performance of laying hens. International Journal of Poultry Science, 2006, 5(1): 26–30
[7] Berckmans D. General introduction to precision livestock farming. Animal Frontiers, 2017, 7(1): 6–11
[8] Taleb H M, Mahrose M K, Abdel-Halim A A, Kasem H, Ramadan G S, Fouad A M, Khafaga A F, Khalifa N E, Kamal M, Salem H M, Alqhtani H A, Swelum A A, Arczewska-Włosek A, Świątkiewicz S, Abd El-Hack M E. Using artificial intelligence to improve poultry productivity—A review. Annals of Animal Science, 2024 [Published Online] doi:10.2478/aoas-2024-0039
[9] Xie Q J, Ni J Q, Bao J, Su Z B. A thermal environmental model for indoor air temperature prediction and energy consumption in pig building. Building and Environment, 2019, 161: 106238
[10] Mirzaee-Ghaleh E, Omid M, Keyhani A, Dalvand M J. Comparison of fuzzy and on/off controllers for winter season indoor climate management in a model poultry house. Computers and Electronics in Agriculture, 2015, 110: 187–195
[11] Wang Y, Zheng W, Li B. Application of a novel grey model for forecasting indoor air temperature in poultry houses: model development. Journal of the ASABE, 2022, 65(4): 681–693
[12] Wang Y, Zheng W C, Li B M. A modified discrete grey model with improved prediction performance for indoor air temperatures in laying hen houses. Biosystems Engineering, 2022, 223(3): 138–148
[13] Xu Z Y, Zou X G, Yin Z J, Zhang S K, Song Y Y, Zhang J, Lu J X. Prediction model of ammonia concentration in yellow-feather broilers house during winter based on EEMD-GRU. INMATEH-Agricultural Engineering, 2020, 61(2): 59–70
[14] Lim B, Zohren S. Time-series forecasting with deep learning: a survey. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2021, 379(2194): 20200209
[15] Chen Z L, Ma M B, Li T R, Wang H J, Li C S. Long sequence time-series forecasting with deep learning: a survey. Information Fusion, 2023, 97: 101819
[16] Dai X L, Liu J J, Li Y L. A recurrent neural network using historical data to predict time series indoor PM2.5 concentrations for residential buildings. Indoor Air, 2021, 31(4): 1228–1237
[17] Sekertekin A, Bilgili M, Arslan N, Yildirim A, Celebi K, Ozbek A. Short-term air temperature prediction by adaptive neuro-fuzzy inference system (ANFIS) and long short-term memory (LSTM) network. Meteorology and Atmospheric Physics, 2021, 133(3): 943–959
[18] He Z H, Jiang T C, Jiang Y, Luo Q, Chen S, Gong K Y, He L, Feng H, Yu Q, Tan F Y, He J Q. Gated recurrent unit models outperform other machine learning models in prediction of minimum temperature in greenhouse based on local weather data. Computers and Electronics in Agriculture, 2022, 202: 107416
[19] Shen L, Wang Y Z. TCCT: tightly-coupled convolutional transformer on time series forecasting. Neurocomputing, 2022, 480: 131–145
[20] Wu N, Green B, Ben X, O’Banion S. Deep transformer models for time series forecasting: the influenza prevalence case. 2020, arXiv preprint arXiv:2001.08317
[21] Liu Y, Sun G L, Qiu Y, Zhang L, Chhatkuli A, Van Gool L. Transformer in convolutional neural networks. 2021, arXiv preprint arXiv:2106.03180
[22] Chen Y, Chen X, Xu A, Sun Q, Peng X. A hybrid CNN-transformer model for ozone concentration prediction. Air Quality, Atmosphere & Health, 2022, 15(9): 1533–1546
[23] Szegedy C, Liu W, Jia Y Q, Sermanet P, Reed S, Anguelov D, Erhan D, Vanhoucke V, Rabinovich A. Going deeper with convolutions. 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Boston, MA, USA, 2015, 1–9
[24] Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation, 1997, 9(8): 1735–1780
[25] Lundberg S M, Lee S I. A unified approach to interpreting model predictions. 31st Conference on Neural Information Processing Systems. Long Beach, CA, USA, 2017, 4768–4777
[26] Van Limbergen T, Sarrazin S, Chantziaras I, Dewulf J, Ducatelle R, Kyriazakis I, Mcmullin P, Mendez J, Niemi J K, Papasolomontos S, Szeleszczuk P, Van Erum J, Maes D. Risk factors for poor health and performance in European broiler production systems. BMC Veterinary Research, 2020, 16(1): 1–13
[27] Chai L, Ni J Q, Diehl C A, Kilic I, Heber A J, Chen Y, Cortus E L, Bogan B W, Lim T T, Ramirez-Dorronsoro J C, Chen L. Ventilation rates in large commercial layer hen houses with two-year continuous monitoring. British Poultry Science, 2012, 53(1): 19–31
[28] Cheng Q, Li H, Rong L, Feng X L, Zhang G J, Li B M. Using CFD to assess the influence of ceiling deflector design on airflow distribution in hen house with tunnel ventilation. Computers and Electronics in Agriculture, 2018, 151: 165–174
[29] Wang Y, Li B M, Liang C, Zheng W C. Dynamic simulation of thermal load and energy efficiency in poultry buildings in the cold zone of China. Computers and Electronics in Agriculture, 2020, 168(C): 105127

Acknowledgements

This research was supported by National Key R&D Program of China (2023YFD2000805).

Compliance with ethics guidelines

Hengyi Ji and Guanghui Teng declare that they have no conflicts of interest or financial conflicts to disclose. All applicable institutional and national guidelines for the care and use of animals were followed.

RIGHTS & PERMISSIONS

© The Author(s) 2025. Published by Higher Education Press. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0)