Preprint / Version 1

FoilForm: Ultra-Compact Geometry-Aware Transformers for Airfoil Polar Prediction


DOI:

https://doi.org/10.31224/6870

Keywords:

Efficient transformer, Airfoil

Abstract

We present FoilForm, a two-stage neural surrogate for predicting aerodynamic lift and drag coefficients (Cl, Cd) directly from airfoil contour geometry across angle-of-attack sweeps. The first stage is an ultra-compact autoregressive transformer (4,470 parameters) with a novel pairwise outer-product attention mechanism that captures second-order geometric feature interactions. The second stage is a lightweight convolutional correction network (34,290 parameters) that refines predictions using the full-resolution contour. Trained on 1,768 airfoils from a compiled dataset of 2,946 profiles at Re = 10⁵, the full pipeline achieves validation MAE of 0.040 on Cl and 0.0081 on Cd across 1,178 held-out airfoils, while running at 0.3 ms per airfoil in batched CPU inference. Ablation studies demonstrate that pairwise interactions, minimal dropout (p = 0.05), and 4-layer depth are critical design choices. We benchmark against NeuralFoil (xxxlarge) and show 9× lower Cl error and 19× lower L/D error on matched evaluation, while using 8.6× fewer parameters.
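The abstract does not give the attention formula, but one common way to capture second-order (pairwise) feature interactions is a bilinear score s_ij = q_iᵀ W k_j, i.e. a learned linear functional of the outer product q_i ⊗ k_j. A minimal NumPy sketch under that assumption — all names and weight matrices here are illustrative, not the paper's actual layers:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pairwise_outer_attention(x, w_q, w_k, w_v, w_pair):
    """Hypothetical single-head sketch of outer-product attention.

    Score s_ij = q_i^T w_pair k_j, equivalent to weighting the flattened
    outer product q_i (x) k_j by w_pair, so attention depends on pairwise
    feature interactions rather than a plain dot product.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = (q @ w_pair @ k.T) / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
n, d = 16, 8  # e.g. 16 contour tokens, 8 geometric features each (illustrative sizes)
x = rng.standard_normal((n, d))
w_q, w_k, w_v, w_pair = (rng.standard_normal((d, d)) * 0.1 for _ in range(4))
out = pairwise_outer_attention(x, w_q, w_k, w_v, w_pair)
print(out.shape)  # (16, 8)
```

Note that when w_pair is the identity, this reduces to standard scaled dot-product attention, so the bilinear term is a strict generalization.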


Posted

2026-04-20