Discussion Closed. This discussion was created more than 6 months ago and has been closed.
Error using Global ODEs: Undefined variable minput.T
Posted Feb 21, 2024, 12:25 p.m. EST | Materials, Modeling Tools & Definitions, Studies & Solvers | Version 6.1 | 0 Replies
Dear all,
My principal setup consists of the Heat Transfer in Solids and Liquids interface coupled with Laminar Flow, using a stationary solver. As the inlet flow boundary condition, I use the average fluid velocity. I need to calculate this velocity from an analytic equation that depends on the simulated pressure drop across my geometry. To account for this implicit problem, I have set up a time-independent equation using the Global ODEs and DAEs node.
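To illustrate the kind of implicit coupling involved, here is a minimal sketch as a fixed-point problem: the inlet velocity determines the pressure drop, while the velocity itself is given by an analytic function of that pressure drop. Both relations below (pressure_drop and velocity_from_dp) are hypothetical placeholders, not the actual model equations.

```python
def pressure_drop(v, k=50.0):
    """Hypothetical pressure-drop model: linear (laminar-like) loss in Pa."""
    return k * v

def velocity_from_dp(dp, v_max=1.0, dp_ref=100.0):
    """Hypothetical analytic inlet-velocity equation v = f(dp)."""
    return v_max / (1.0 + dp / dp_ref)

def solve_self_consistent(v0=0.5, tol=1e-10, max_iter=200):
    """Fixed-point iteration v_{n+1} = f(dp(v_n)) until self-consistent."""
    v = v0
    for _ in range(max_iter):
        v_new = velocity_from_dp(pressure_drop(v))
        if abs(v_new - v) < tol:
            return v_new
        v = v_new
    raise RuntimeError("fixed-point iteration did not converge")

v = solve_self_consistent()
print(v)
```

Inside COMSOL, a Global Equations node plays the role of this outer loop: the solver drives the residual of the velocity equation to zero together with the flow and heat equations.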
In that equation, I call the dynamic viscosity of a material (mat1.def.dynamicviscosity) that I have defined using the Thermodynamics node as a mixture of two liquids. The dynamic viscosity in turn is defined by the expression Viscositypp1(T, pA, xw1, xw2).
Now, when I compute the study, I get the following error:

Failed to evaluate initial residual for segregated step 1.
Undefined variable.
- Variable: minput.T
- Global scope
Failed to evaluate variable.
- Variable: comp1.pp1mat1.def.mu
It seems that the global ODE node does not recognize the default model input temperature minput.T that is required to calculate the liquid's dynamic viscosity. The problem remains even if I use a standard material like water.
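To illustrate what I suspect is happening: the temperature is a field (one value per point in the component), so a material property evaluated in the global, space-independent scope has no single T to plug in. The sketch below uses a hypothetical water-like viscosity correlation, mu(T), and a hand-picked scalar input (e.g. a domain average) as a stand-in for what the global scope would need.

```python
def mu(T):
    """Hypothetical water-like dynamic viscosity (Vogel-type fit), Pa*s."""
    return 2.414e-5 * 10 ** (247.8 / (T - 140.0))

# Temperature in the component is a field: one value per mesh point.
T_field = [293.15, 298.15, 303.15]

# mu(T_field) has no meaning in a 0-D (global) equation; a scalar is needed,
# e.g. a domain-averaged temperature supplied explicitly.
T_avg = sum(T_field) / len(T_field)
print(mu(T_avg))
```

If this interpretation is right, supplying the model input explicitly as a scalar (for instance via a nonlocal average coupling over the fluid domain) might be the direction of a workaround, but I have not confirmed this.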
I have struggled with this problem for days now but have not found a solution. Is there anything special regarding the evaluation order of the Global ODEs node? Thank you in advance for any help!
Hello Felix,
Your Discussion has gone 30 days without a reply. If you still need help with COMSOL and have an on-subscription license, please visit our Support Center for help.
If you do not hold an on-subscription license, you may find an answer in another Discussion or in the Knowledge Base.