This issue is a fundamental limitation of grid partitioning, not a bug.
The following Web site provides information about the "curse of dimensionality":
Question 5 on that page, titled "How come GENFIS1 generates a huge number of fuzzy rules? What is the 'curse of dimensionality?'", is relevant to this issue.
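To see why grid partitioning blows up, note that GENFIS1 creates one rule for every combination of membership functions across the inputs. A minimal sketch of the arithmetic, assuming 37 input columns (as in the code below) and the default of 2 membership functions per input:

```matlab
% Grid partitioning (GENFIS1) generates one rule per combination
% of input membership functions, so the rule count grows
% exponentially with the number of inputs.
nInputs = 37;          % number of input variables (assumed from the data layout)
nMFs    = 2;           % default membership functions per input
nRules  = nMFs^nInputs % 2^37 = 137438953472 rules -- far too many to train
```

Clustering-based methods such as GENFIS2 avoid this because the number of rules equals the number of clusters found in the data, not the number of MF combinations.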
One way to reduce the number of rules generated for a large dataset is to use the GENFIS2 method (subtractive clustering), for example:
in = train_data(:,1:37);        % input variables (columns 1 through 37)
out = train_data(:,38);         % output variable (column 38)
fismat = genfis2(in,out,0.8);   % subtractive clustering, cluster radius 0.8
fis = anfis(train_data,fismat); % train the clustered FIS with ANFIS
Additionally, for large datasets, it is not recommended to execute multiple trainings in a loop. To manage memory usage, I suggest putting each training in a separate script file and executing "clear classes" at the MATLAB prompt between runs of the scripts.